Mar 12 18:02:38 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 12 18:02:38 crc restorecon[4751]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 18:02:38 crc restorecon[4751]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:38 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 
18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc 
restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 18:02:39 crc restorecon[4751]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 12 18:02:40 crc kubenswrapper[4926]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.198483 4926 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203485 4926 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203515 4926 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203525 4926 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203535 4926 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203548 4926 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203557 4926 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203568 4926 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203577 4926 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203585 4926 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203593 4926 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203601 4926 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203609 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203631 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203639 4926 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203647 4926 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203657 4926 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203665 4926 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203673 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203681 4926 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203689 4926 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203696 4926 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203704 4926 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203712 4926 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203720 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203727 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203735 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203743 4926 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203750 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203758 4926 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203766 4926 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203773 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203781 4926 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203791 4926 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203802 4926 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203811 4926 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203819 4926 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203830 4926 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203840 4926 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203849 4926 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203857 4926 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203866 4926 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203875 4926 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203885 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203894 4926 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203901 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203910 4926 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203920 4926 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203931 4926 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203940 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203948 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203957 4926 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203966 4926 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203975 4926 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203986 4926 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.203994 4926 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204002 4926 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204010 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204017 4926 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204025 4926 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204033 4926 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204040 4926 feature_gate.go:330] unrecognized feature gate: Example Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204048 4926 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 18:02:40 crc 
kubenswrapper[4926]: W0312 18:02:40.204055 4926 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204063 4926 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204071 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204078 4926 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204086 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204094 4926 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204101 4926 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204109 4926 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.204117 4926 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204263 4926 flags.go:64] FLAG: --address="0.0.0.0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204279 4926 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204294 4926 flags.go:64] FLAG: --anonymous-auth="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204306 4926 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204318 4926 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204327 4926 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204339 4926 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204351 4926 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204362 4926 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204371 4926 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204380 4926 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204389 4926 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204398 4926 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204408 4926 flags.go:64] FLAG: --cgroup-root="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204417 4926 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204425 4926 flags.go:64] FLAG: --client-ca-file="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204434 4926 flags.go:64] FLAG: --cloud-config="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204469 4926 flags.go:64] FLAG: --cloud-provider="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204478 4926 flags.go:64] FLAG: --cluster-dns="[]" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204490 4926 flags.go:64] FLAG: 
--cluster-domain="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204499 4926 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204508 4926 flags.go:64] FLAG: --config-dir="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204517 4926 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204527 4926 flags.go:64] FLAG: --container-log-max-files="5" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204538 4926 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204547 4926 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204556 4926 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204565 4926 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204574 4926 flags.go:64] FLAG: --contention-profiling="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204583 4926 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204592 4926 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204602 4926 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204611 4926 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204621 4926 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204630 4926 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204640 4926 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204648 4926 flags.go:64] FLAG: --enable-load-reader="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204657 4926 flags.go:64] FLAG: --enable-server="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204666 4926 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204678 4926 flags.go:64] FLAG: --event-burst="100" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204689 4926 flags.go:64] FLAG: --event-qps="50" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204699 4926 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204708 4926 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204718 4926 flags.go:64] FLAG: --eviction-hard="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204746 4926 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204755 4926 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204765 4926 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204774 4926 flags.go:64] FLAG: --eviction-soft="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204783 4926 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204797 
4926 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204807 4926 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204816 4926 flags.go:64] FLAG: --experimental-mounter-path="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204824 4926 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204833 4926 flags.go:64] FLAG: --fail-swap-on="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204842 4926 flags.go:64] FLAG: --feature-gates="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204853 4926 flags.go:64] FLAG: --file-check-frequency="20s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204862 4926 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204871 4926 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204880 4926 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204889 4926 flags.go:64] FLAG: --healthz-port="10248" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204899 4926 flags.go:64] FLAG: --help="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204908 4926 flags.go:64] FLAG: --hostname-override="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204917 4926 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204926 4926 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204934 4926 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204944 4926 flags.go:64] FLAG: --image-credential-provider-config="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204953 4926 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204961 4926 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204970 4926 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204980 4926 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.204990 4926 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205000 4926 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205011 4926 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205020 4926 flags.go:64] FLAG: --kube-reserved="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205029 4926 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205037 4926 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205047 4926 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205056 4926 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205064 4926 flags.go:64] FLAG: --lock-file="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205074 4926 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205083 4926 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205094 4926 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205107 4926 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205116 4926 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205125 4926 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205134 4926 flags.go:64] FLAG: --logging-format="text" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205143 4926 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205153 4926 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205162 4926 flags.go:64] FLAG: --manifest-url="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205170 4926 flags.go:64] FLAG: --manifest-url-header="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205181 4926 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205190 4926 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205201 4926 flags.go:64] FLAG: --max-pods="110" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205210 4926 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205219 4926 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205228 4926 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205237 4926 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205247 4926 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205256 4926 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205265 4926 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205282 4926 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205292 4926 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205302 4926 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205311 4926 flags.go:64] FLAG: --pod-cidr="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205320 4926 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205332 4926 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205341 4926 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205350 4926 flags.go:64] FLAG: --pods-per-core="0" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 
18:02:40.205359 4926 flags.go:64] FLAG: --port="10250" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205369 4926 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205377 4926 flags.go:64] FLAG: --provider-id="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205386 4926 flags.go:64] FLAG: --qos-reserved="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205395 4926 flags.go:64] FLAG: --read-only-port="10255" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205404 4926 flags.go:64] FLAG: --register-node="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205413 4926 flags.go:64] FLAG: --register-schedulable="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205423 4926 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205458 4926 flags.go:64] FLAG: --registry-burst="10" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205467 4926 flags.go:64] FLAG: --registry-qps="5" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205477 4926 flags.go:64] FLAG: --reserved-cpus="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205486 4926 flags.go:64] FLAG: --reserved-memory="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205496 4926 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205505 4926 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205514 4926 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205523 4926 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205532 4926 flags.go:64] FLAG: --runonce="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205542 4926 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205551 4926 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205561 4926 flags.go:64] FLAG: --seccomp-default="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205570 4926 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205579 4926 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205588 4926 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205598 4926 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205607 4926 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205616 4926 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205625 4926 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205634 4926 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205643 4926 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205652 4926 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205662 4926 flags.go:64] FLAG: 
--system-cgroups="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205671 4926 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205686 4926 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205695 4926 flags.go:64] FLAG: --tls-cert-file="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205704 4926 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205714 4926 flags.go:64] FLAG: --tls-min-version="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205723 4926 flags.go:64] FLAG: --tls-private-key-file="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205731 4926 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205740 4926 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205749 4926 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205758 4926 flags.go:64] FLAG: --v="2" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205769 4926 flags.go:64] FLAG: --version="false" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205780 4926 flags.go:64] FLAG: --vmodule="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205791 4926 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.205801 4926 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206012 4926 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206025 4926 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206033 4926 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206043 4926 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206051 4926 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206060 4926 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206069 4926 feature_gate.go:330] unrecognized feature gate: Example Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206077 4926 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206085 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206092 4926 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206100 4926 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206108 4926 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206116 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206124 4926 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206132 4926 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206140 4926 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206149 4926 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206157 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206167 4926 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206178 4926 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206187 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206196 4926 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206205 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206216 4926 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206225 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206234 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206246 4926 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206255 4926 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206264 4926 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206272 4926 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206280 4926 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206290 4926 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206299 4926 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206307 4926 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206316 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206325 4926 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206333 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206341 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 
18:02:40.206350 4926 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206358 4926 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206366 4926 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206374 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206382 4926 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206390 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206397 4926 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206405 4926 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206412 4926 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206423 4926 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206433 4926 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206465 4926 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206473 4926 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206481 4926 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206489 4926 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206497 4926 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206504 4926 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206512 4926 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206520 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206528 4926 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206538 4926 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206547 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206556 4926 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206566 4926 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206574 4926 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206582 4926 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206590 4926 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206598 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206606 4926 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206613 4926 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206621 4926 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206630 4926 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.206638 4926 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.207492 4926 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.234216 4926 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.234280 4926 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234406 4926 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234429 4926 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234468 4926 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234478 4926 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234487 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234495 4926 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234503 4926 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234514 4926 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
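The I-level feature_gate.go:386 entry above is the kubelet's resolved feature-gate map, printed in Go's fmt syntax ({map[Name:bool ...]}); the same map is dumped twice more below after later parsing passes, and none of the "unrecognized feature gate" names ever appear in it. The helper below is an assumption of mine rather than tooling that ships with the kubelet: a small Go sketch that extracts the map from a log line so the repeated dumps can be compared programmatically.

package main

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

// parseFeatureGates pulls the resolved gate map out of a kubelet log line of
// the form `feature gates: {map[Name:bool Name:bool ...]}`.
func parseFeatureGates(line string) (map[string]bool, error) {
	re := regexp.MustCompile(`feature gates: \{map\[([^\]]*)\]\}`)
	m := re.FindStringSubmatch(line)
	if m == nil {
		return nil, fmt.Errorf("no feature-gate map in line")
	}
	gates := make(map[string]bool)
	for _, kv := range strings.Fields(m[1]) {
		name, val, ok := strings.Cut(kv, ":")
		if !ok {
			continue
		}
		b, err := strconv.ParseBool(val)
		if err != nil {
			return nil, fmt.Errorf("gate %s: %w", name, err)
		}
		gates[name] = b
	}
	return gates, nil
}

func main() {
	// Sample taken (shortened) from the dump above.
	line := `feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}`
	gates, err := parseFeatureGates(line)
	if err != nil {
		panic(err)
	}
	fmt.Println(gates["KMSv1"]) // true
}

Applied to all three dumps in this log it would return identical maps, which matches a plain-eye comparison: the repeated warning storms are the same OpenShift-specific gate names being re-evaluated by the embedded Kubernetes gate parser, and they leave the resolved set unchanged.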
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234527 4926 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234537 4926 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234546 4926 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234555 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234563 4926 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234572 4926 feature_gate.go:330] unrecognized feature gate: Example Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234579 4926 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234590 4926 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234601 4926 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234611 4926 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234619 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234627 4926 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234638 4926 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234648 4926 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234658 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234667 4926 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234675 4926 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234684 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234693 4926 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234701 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234709 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234718 4926 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234727 4926 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234736 4926 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234743 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234751 4926 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234760 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234768 4926 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234777 4926 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234786 4926 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234794 4926 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234803 4926 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234811 4926 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234818 4926 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234826 4926 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234835 4926 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234844 4926 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234853 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234881 4926 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234889 4926 feature_gate.go:330] unrecognized 
feature gate: AWSEFSDriverVolumeMetrics Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234897 4926 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234905 4926 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234912 4926 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234920 4926 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234928 4926 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234936 4926 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234943 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234951 4926 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234959 4926 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234967 4926 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234975 4926 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234983 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234992 4926 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.234999 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235010 4926 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235018 4926 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235026 4926 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235034 4926 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235042 4926 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235050 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235057 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235065 4926 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235073 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.235086 4926 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235318 4926 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235333 4926 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235343 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235352 4926 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235363 4926 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235371 4926 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235380 4926 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235390 4926 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235398 4926 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235406 4926 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235414 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235422 4926 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235429 4926 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235460 4926 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235469 4926 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235477 4926 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235487 4926 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235498 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235508 4926 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235516 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235524 4926 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235532 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235539 4926 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235548 4926 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235556 4926 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235564 4926 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235571 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235579 4926 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235586 4926 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235594 4926 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235602 4926 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235610 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235618 4926 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235627 4926 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235635 4926 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235642 4926 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235650 4926 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235659 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235667 4926 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235675 4926 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235683 4926 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235693 4926 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235703 4926 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235712 4926 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235721 4926 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235730 4926 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235738 4926 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235746 4926 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235753 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235761 4926 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235771 4926 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235781 4926 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235790 4926 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235798 4926 feature_gate.go:330] unrecognized feature gate: Example Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235807 4926 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235817 4926 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235825 4926 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235833 4926 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235842 4926 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235850 4926 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235859 4926 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235867 4926 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235877 4926 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235887 4926 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235898 4926 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235906 4926 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235915 4926 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235923 4926 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235932 4926 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235939 4926 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.235947 4926 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.235961 4926 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.236866 4926 server.go:940] "Client rotation is on, will bootstrap in background" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.243574 4926 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.248359 4926 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.248566 4926 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
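Two certificate facts are reported in the entries above: the bootstrap client certificate in /var/lib/kubelet/kubeconfig is already expired (since 2026-02-24), so the kubelet falls back to the bootstrap credentials and loads /var/lib/kubelet/pki/kubelet-client-current.pem. When triaging this kind of failure it helps to print the validity window of that PEM directly; the following is a minimal sketch using only the Go standard library, with the path taken from the log (adjust as needed).

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

func main() {
	// Path taken from the certificate_store.go entry above.
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	// The file bundles the private key with the certificate; walk every PEM
	// block and report the validity window of each CERTIFICATE found.
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			panic(err)
		}
		fmt.Printf("subject=%q notBefore=%s notAfter=%s\n",
			cert.Subject.CommonName, cert.NotBefore, cert.NotAfter)
	}
}

openssl x509 -noout -dates -in /var/lib/kubelet/pki/kubelet-client-current.pem should print the same window for the first certificate in the file.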
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.250728 4926 server.go:997] "Starting client certificate rotation" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.250779 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.251004 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.279316 4926 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.281719 4926 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.282878 4926 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.300638 4926 log.go:25] "Validated CRI v1 runtime API" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.338517 4926 log.go:25] "Validated CRI v1 image API" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.340694 4926 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.346503 4926 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-12-17-57-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.346548 4926 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.393082 4926 manager.go:217] Machine: {Timestamp:2026-03-12 18:02:40.390366954 +0000 UTC m=+0.758993357 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9f4a0cfb-e2ee-40d1-a613-eac4618fc62c BootID:2090c8b2-af81-407e-bc9b-78510eed61ed Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 
HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2f:4c:47 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2f:4c:47 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:48:34:3c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:dd:8f:93 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:be:af:fb Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:15:f0:11 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:c5:9b:83 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:ed:ff:ef:1e:67 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:0b:7a:3f:72:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: 
DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.393922 4926 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.394309 4926 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.395790 4926 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.396223 4926 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.396408 4926 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.398596 4926 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.398692 4926 container_manager_linux.go:303] "Creating device plugin manager" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.399549 4926 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.399648 4926 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.400081 4926 state_mem.go:36] "Initialized new in-memory state store" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.400681 4926 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.404067 4926 kubelet.go:418] "Attempting to sync node with API server" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.404172 4926 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.404253 4926 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.404328 4926 kubelet.go:324] "Adding apiserver pod source" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.404408 4926 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.410287 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.410240 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.410458 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.410433 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.411824 4926 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.413430 4926 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.415040 4926 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417200 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417260 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417285 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417306 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417333 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417348 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417371 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417400 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417415 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417428 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417474 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417523 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.417661 4926 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.418373 4926 server.go:1280] "Started kubelet" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.418842 4926 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.420666 4926 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.420651 4926 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.421115 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:40 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.425778 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.425821 4926 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.428667 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.429028 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="200ms" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.429098 4926 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.429115 4926 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.429107 4926 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.432114 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.432425 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.432822 4926 factory.go:55] Registering systemd factory Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.432865 4926 factory.go:221] Registration of the systemd container factory successfully Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436024 4926 server.go:460] "Adding debug handlers to kubelet server" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436246 4926 factory.go:153] Registering CRI-O factory Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436272 4926 factory.go:221] Registration of the crio container factory successfully Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436362 4926 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436402 4926 factory.go:103] Registering Raw factory Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.436429 4926 manager.go:1196] Started watching for new ooms in manager Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.437119 4926 manager.go:319] Starting recovery of all containers Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.435709 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441496 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441618 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441629 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441640 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441650 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441680 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441691 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441700 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441711 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441720 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441729 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441739 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441748 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441762 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441770 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441780 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441789 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441801 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441812 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441821 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441831 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441843 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441858 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.441869 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445398 4926 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445500 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445519 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445561 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445676 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445700 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445748 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445766 4926 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445779 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445844 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445860 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445874 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445884 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445902 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445911 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445921 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445937 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445946 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445960 4926 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445969 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445979 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.445997 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446010 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446019 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446035 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446044 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446061 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446070 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446083 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446099 4926 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446109 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446125 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446137 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446147 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446158 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446243 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446255 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446265 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446275 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446284 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446294 4926 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446304 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446316 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446327 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446344 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446356 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446366 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446376 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446385 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446394 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446405 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446414 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446425 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446456 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446470 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446483 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446494 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446505 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446517 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446535 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446549 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446571 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446583 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446595 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446614 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446742 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446756 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446770 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446788 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446831 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446840 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446856 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446866 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446878 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446894 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446903 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446913 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446923 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446932 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446941 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446950 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.446987 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447004 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447015 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447025 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447036 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447045 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447055 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447065 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447075 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447084 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447118 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447127 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447135 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447145 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447156 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447198 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447216 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447235 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447249 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447268 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447283 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447303 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447314 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447325 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447345 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447367 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447379 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447393 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447406 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447419 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447463 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447519 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447540 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447552 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447571 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447596 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447611 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447629 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447642 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447663 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447681 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447699 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447710 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447721 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447732 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447742 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447793 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447807 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447819 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447830 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447847 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447860 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447872 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447882 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447894 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447912 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447924 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447935 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447946 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447958 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447968 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447981 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.447994 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448011 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448022 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448032 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448043 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448059 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448073 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448089 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448099 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448112 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448123 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448133 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448144 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448156 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448165 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448176 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448186 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448198 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448208 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" 
seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448221 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448231 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448241 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448256 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448273 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448283 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448294 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448305 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448316 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448328 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448339 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 12 
18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448356 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448373 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448385 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448402 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448413 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448425 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448482 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448493 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448504 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448512 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448522 4926 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448539 4926 reconstruct.go:97] 
"Volume reconstruction finished" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.448549 4926 reconciler.go:26] "Reconciler: start to sync state" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.458330 4926 manager.go:324] Recovery completed Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.472249 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.473965 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.474007 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.474017 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.474714 4926 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.474731 4926 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.474752 4926 state_mem.go:36] "Initialized new in-memory state store" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.487005 4926 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.488558 4926 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.488625 4926 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.488656 4926 kubelet.go:2335] "Starting kubelet main sync loop" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.488708 4926 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.494812 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.494909 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.498927 4926 policy_none.go:49] "None policy: Start" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.499773 4926 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.499821 4926 state_mem.go:35] "Initializing new in-memory state store" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.529135 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.550722 4926 manager.go:334] "Starting Device Plugin manager" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.550982 4926 manager.go:513] 
"Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.551113 4926 server.go:79] "Starting device plugin registration server" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.551976 4926 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.552103 4926 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.552461 4926 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.552842 4926 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.552949 4926 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.563138 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.589770 4926 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.589910 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591215 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591289 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591300 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591514 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591831 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.591898 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.592665 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.592697 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.592706 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.592824 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.592976 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593033 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593206 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593231 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593242 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593850 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593879 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.593890 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594001 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594093 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594127 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594188 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594239 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594254 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594897 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594949 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.594960 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595127 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595237 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595279 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595777 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595802 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595810 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.595973 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596020 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596037 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596217 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596253 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596348 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596380 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.596394 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.597244 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.597273 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.597284 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.630740 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="400ms" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650789 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650827 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650847 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650863 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650883 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650895 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650908 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650920 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650935 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650947 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650959 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650973 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.650989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.651002 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.651014 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.652500 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.663256 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.663366 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.663377 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.663407 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.663935 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752237 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752325 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc 
kubenswrapper[4926]: I0312 18:02:40.752346 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752364 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752388 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752403 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752420 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752452 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752469 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752511 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752530 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752550 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 
18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752567 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752584 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.752601 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753296 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753392 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753468 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753416 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753497 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753518 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753495 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753521 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753490 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753551 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753459 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753589 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753222 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753717 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.753421 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.864142 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.865750 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.865814 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:40 crc 
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.865863 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:02:40 crc kubenswrapper[4926]: E0312 18:02:40.866458 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc"
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.935340 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.963953 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.975819 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3862009dd2e05853e234112e55fe41ddb949e60b04117f03ddaacbf94d8dd2a6 WatchSource:0}: Error finding container 3862009dd2e05853e234112e55fe41ddb949e60b04117f03ddaacbf94d8dd2a6: Status 404 returned error can't find the container with id 3862009dd2e05853e234112e55fe41ddb949e60b04117f03ddaacbf94d8dd2a6
Mar 12 18:02:40 crc kubenswrapper[4926]: I0312 18:02:40.981980 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 12 18:02:40 crc kubenswrapper[4926]: W0312 18:02:40.998538 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0ae31a995515ef5752bc5eecbe19c3f22309d3753d138ae5d7757ba52d9f5b43 WatchSource:0}: Error finding container 0ae31a995515ef5752bc5eecbe19c3f22309d3753d138ae5d7757ba52d9f5b43: Status 404 returned error can't find the container with id 0ae31a995515ef5752bc5eecbe19c3f22309d3753d138ae5d7757ba52d9f5b43
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.008516 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.021815 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.024627 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8927119fbdc03b9d6e06195d31070a3a2daded15f8b5c58f073c79dadbf836c4 WatchSource:0}: Error finding container 8927119fbdc03b9d6e06195d31070a3a2daded15f8b5c58f073c79dadbf836c4: Status 404 returned error can't find the container with id 8927119fbdc03b9d6e06195d31070a3a2daded15f8b5c58f073c79dadbf836c4
Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.031656 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="800ms"
Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.050106 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-67bdfb711c5b51f994a5d2aa4db25b18ef4e744a124f5de516ca4e4cff5ba2d9 WatchSource:0}: Error finding container 67bdfb711c5b51f994a5d2aa4db25b18ef4e744a124f5de516ca4e4cff5ba2d9: Status 404 returned error can't find the container with id 67bdfb711c5b51f994a5d2aa4db25b18ef4e744a124f5de516ca4e4cff5ba2d9
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.266922 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.268225 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.268304 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.268323 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.268361 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.268909 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc"
Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.291251 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.291378 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError"
Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.413937 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.414032 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.423153 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.494767 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3862009dd2e05853e234112e55fe41ddb949e60b04117f03ddaacbf94d8dd2a6"} Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.495759 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"67bdfb711c5b51f994a5d2aa4db25b18ef4e744a124f5de516ca4e4cff5ba2d9"} Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.496725 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8927119fbdc03b9d6e06195d31070a3a2daded15f8b5c58f073c79dadbf836c4"} Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.497652 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0ae31a995515ef5752bc5eecbe19c3f22309d3753d138ae5d7757ba52d9f5b43"} Mar 12 18:02:41 crc kubenswrapper[4926]: I0312 18:02:41.498824 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8d840b9a12e76918b83701c57bb79dae78c13ab83e73fd6bc27aeefd9b9a09ba"} Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.546931 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.547034 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.833258 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="1.6s" Mar 12 18:02:41 crc 
Mar 12 18:02:41 crc kubenswrapper[4926]: W0312 18:02:41.924269 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:41 crc kubenswrapper[4926]: E0312 18:02:41.924383 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.069633 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.071975 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.072019 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.072032 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.072061 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:02:42 crc kubenswrapper[4926]: E0312 18:02:42.072385 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.313078 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 18:02:42 crc kubenswrapper[4926]: E0312 18:02:42.314149 4926 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.422548 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.503354 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17" exitCode=0
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.503419 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.503585 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.504904 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.504941 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.504952 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.506024 4926 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f93765ed57dbcebda4b710bf4cafac472d4a67f6d60b259c2314ad5169301c1e" exitCode=0
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.506133 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f93765ed57dbcebda4b710bf4cafac472d4a67f6d60b259c2314ad5169301c1e"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.506193 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.506610 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507521 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507587 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507607 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507759 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507801 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.507821 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.509886 4926 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3" exitCode=0
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.509986 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.510088 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.511558 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.511611 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.511632 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.512600 4926 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2" exitCode=0
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.512669 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.512734 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.513735 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.513777 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.513796 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.517147 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b8ff9f1fe3b91c7273624abf9e138c54d1d2228edc8e5ff370cdcc3b8df4a7d5"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.517202 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f6f33c0bfdb670b43186efb3e52df85915bd35749a127245356f71fe96994d85"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.517216 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.517230 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc"}
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.517270 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.518382 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.518471 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:42 crc kubenswrapper[4926]: I0312 18:02:42.518484 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:43 crc kubenswrapper[4926]: W0312 18:02:43.149053 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:43 crc kubenswrapper[4926]: E0312 18:02:43.149156 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.422660 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Mar 12 18:02:43 crc kubenswrapper[4926]: E0312 18:02:43.434555 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="3.2s" Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.524012 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882"} Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.524069 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808"} Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.524083 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4"} Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.524095 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b"} Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.527116 4926 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7fb489c2b03ee964d954776552ca7b84509be8dfcfff33050c034d54a141bc63" exitCode=0 Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.527234 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.527225 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7fb489c2b03ee964d954776552ca7b84509be8dfcfff33050c034d54a141bc63"} Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.528695 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.528739 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:43 crc 
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.529725 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a00032a5db7175b95edd80be3a15bca3e5bfee1c8bcc8bb2353ab3b620e12b08"}
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.529753 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.530422 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.530457 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.530467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.534318 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c"}
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.534353 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225"}
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.534370 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a"}
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.534391 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.534401 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.536678 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.536732 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.536749 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.539233 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.539282 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.539294 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.672484 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.673786 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.673826 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.673839 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:43 crc kubenswrapper[4926]: I0312 18:02:43.673866 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:02:43 crc kubenswrapper[4926]: E0312 18:02:43.674318 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc"
Mar 12 18:02:43 crc kubenswrapper[4926]: W0312 18:02:43.894696 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:43 crc kubenswrapper[4926]: E0312 18:02:43.894802 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError"
Mar 12 18:02:44 crc kubenswrapper[4926]: W0312 18:02:44.014289 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:44 crc kubenswrapper[4926]: E0312 18:02:44.014478 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.422229 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.538724 4926 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="63c194a342c3a7d6ac13482ec064af4b05918784b371b742dd4d9e15a3c05cb6" exitCode=0
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.538839 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"63c194a342c3a7d6ac13482ec064af4b05918784b371b742dd4d9e15a3c05cb6"}
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.538845 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.540272 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.540304 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.540313 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544101 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544112 4926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544083 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c8d19a67716235d9b7738908257e89f8376e18335e640fab2afb758a0f55aa22"}
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544143 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544175 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544846 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544876 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.544886 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545172 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545209 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545175 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545257 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545271 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:44 crc kubenswrapper[4926]: I0312 18:02:44.545222 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.550075 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.549990 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59e6a444813d1bdcdb28520b832548d594b7170869334f17d0e1cb3c5c4c3338"}
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.550141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"587b28a1d4a204a36e5a2d1828a497d64dd9774cef86641c25dd276fe3185c12"}
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.550154 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4da583451dead089b4282bf5a7ae151758817a7761131745a238968a4a1d9f75"}
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.550164 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5b57f2b6ccf2e83dc3cdc2b924966715676cca996f717685dda64c8a1556525"}
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.550223 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.551005 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.551057 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:45 crc kubenswrapper[4926]: I0312 18:02:45.551079 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.556938 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"357166a9a18663c77154337ba5b95c2485ed2d008dab9caf790697c54e55da27"}
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.557029 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.557050 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558525 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558539 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558566 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558601 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.558614 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.562131 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.621482 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.874800 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.876621 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.876684 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.876709 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:46 crc kubenswrapper[4926]: I0312 18:02:46.876748 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.103865 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.365671 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.365840 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.366975 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.367027 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.367045 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.558987 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.560033 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.560090 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.560103 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.572305 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.572579 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.574092 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.574138 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.574149 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.823262 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.823498 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.824555 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.824586 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.824597 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:47 crc kubenswrapper[4926]: I0312 18:02:47.905277 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.561336 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.561511 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562591 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562638 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562648 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562656 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562671 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:48 crc kubenswrapper[4926]: I0312 18:02:48.562745 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:49 crc kubenswrapper[4926]: I0312 18:02:49.323565 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:49 crc kubenswrapper[4926]: I0312 18:02:49.323828 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:49 crc kubenswrapper[4926]: I0312 18:02:49.325368 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:49 crc kubenswrapper[4926]: I0312 18:02:49.325431 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:49 crc kubenswrapper[4926]: I0312 18:02:49.325491 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:50 crc kubenswrapper[4926]: E0312 18:02:50.563269 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 18:02:51 crc kubenswrapper[4926]: I0312 18:02:51.278762 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:51 crc kubenswrapper[4926]: I0312 18:02:51.279493 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:51 crc kubenswrapper[4926]: I0312 18:02:51.281002 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:51 crc kubenswrapper[4926]: I0312 18:02:51.281262 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:51 crc kubenswrapper[4926]: I0312 18:02:51.281351 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.439196 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.439430 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.441045 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.441101 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.441119 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.446170 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.570987 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.572299 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.572380 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.572395 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:52 crc kubenswrapper[4926]: I0312 18:02:52.577144 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:02:53 crc kubenswrapper[4926]: I0312 18:02:53.572961 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:02:53 crc kubenswrapper[4926]: I0312 18:02:53.573978 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:02:53 crc kubenswrapper[4926]: I0312 18:02:53.574039 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:02:53 crc kubenswrapper[4926]: I0312 18:02:53.574064 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.279502 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.279591 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.579044 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.581723 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c8d19a67716235d9b7738908257e89f8376e18335e640fab2afb758a0f55aa22" exitCode=255 Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.581790 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c8d19a67716235d9b7738908257e89f8376e18335e640fab2afb758a0f55aa22"} Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.582067 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.583518 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.583576 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.583588 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.584321 4926 scope.go:117] "RemoveContainer" containerID="c8d19a67716235d9b7738908257e89f8376e18335e640fab2afb758a0f55aa22" Mar 12 18:02:54 crc kubenswrapper[4926]: W0312 18:02:54.599482 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.599624 4926 trace.go:236] Trace[1716282484]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 18:02:44.597) (total time: 10001ms): Mar 12 18:02:54 crc kubenswrapper[4926]: Trace[1716282484]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:02:54.599) Mar 12 18:02:54 crc kubenswrapper[4926]: Trace[1716282484]: [10.001659953s] [10.001659953s] END Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.599660 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 12 18:02:54 
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.924678 4926 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.926577 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.926784 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z
Mar 12 18:02:54 crc kubenswrapper[4926]: W0312 18:02:54.928631 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.928711 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.929550 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.929802 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 12 18:02:54 crc kubenswrapper[4926]: W0312 18:02:54.933530 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.933667 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.936747 4926 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.936824 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 12 18:02:54 crc kubenswrapper[4926]: W0312 18:02:54.939242 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z
Mar 12 18:02:54 crc kubenswrapper[4926]: E0312 18:02:54.939418 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.943540 4926 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 18:02:54 crc kubenswrapper[4926]: I0312 18:02:54.943623 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.427908 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:55Z is after 2026-02-23T05:33:13Z
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:55Z is after 2026-02-23T05:33:13Z Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.586163 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.588518 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac"} Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.588754 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.589983 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.590058 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:55 crc kubenswrapper[4926]: I0312 18:02:55.590166 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.428272 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:56Z is after 2026-02-23T05:33:13Z Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.595180 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.595886 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.598455 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" exitCode=255 Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.598522 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac"} Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.598659 4926 scope.go:117] "RemoveContainer" containerID="c8d19a67716235d9b7738908257e89f8376e18335e640fab2afb758a0f55aa22" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.598830 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.600053 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.600125 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.600151 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.601181 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:02:56 crc kubenswrapper[4926]: E0312 18:02:56.601612 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.651099 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.651431 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.653241 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.653301 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.653316 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:56 crc kubenswrapper[4926]: I0312 18:02:56.671909 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.425722 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:57Z is after 2026-02-23T05:33:13Z Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.603922 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.606215 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.607175 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.607219 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.607230 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.911689 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.911896 4926 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.913377 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.913417 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.913429 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.913982 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:02:57 crc kubenswrapper[4926]: E0312 18:02:57.914598 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:02:57 crc kubenswrapper[4926]: I0312 18:02:57.919580 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:02:57 crc kubenswrapper[4926]: W0312 18:02:57.933427 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:57Z is after 2026-02-23T05:33:13Z Mar 12 18:02:57 crc kubenswrapper[4926]: E0312 18:02:57.933561 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.425490 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:58Z is after 2026-02-23T05:33:13Z Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.608225 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.609369 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.609428 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.609464 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:02:58 crc kubenswrapper[4926]: I0312 18:02:58.610177 4926 scope.go:117] "RemoveContainer" 
containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:02:58 crc kubenswrapper[4926]: E0312 18:02:58.610392 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:02:59 crc kubenswrapper[4926]: I0312 18:02:59.426932 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:02:59Z is after 2026-02-23T05:33:13Z Mar 12 18:03:00 crc kubenswrapper[4926]: I0312 18:03:00.426676 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:00Z is after 2026-02-23T05:33:13Z Mar 12 18:03:00 crc kubenswrapper[4926]: E0312 18:03:00.563427 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.229518 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.229777 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.231368 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.231510 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.231542 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.232413 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:03:01 crc kubenswrapper[4926]: E0312 18:03:01.232722 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.329880 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.331602 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.331687 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.331709 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.331754 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:01 crc kubenswrapper[4926]: E0312 18:03:01.333109 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 18:03:01 crc kubenswrapper[4926]: E0312 18:03:01.334834 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.338046 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.428962 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:01Z is after 2026-02-23T05:33:13Z Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.617172 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.618551 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.618608 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.618622 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:01 crc kubenswrapper[4926]: I0312 18:03:01.619410 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:03:01 crc kubenswrapper[4926]: E0312 18:03:01.619654 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:02 crc kubenswrapper[4926]: I0312 18:03:02.428219 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:02Z is after 2026-02-23T05:33:13Z Mar 12 18:03:03 crc kubenswrapper[4926]: I0312 18:03:03.305250 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating 
certificates Mar 12 18:03:03 crc kubenswrapper[4926]: E0312 18:03:03.310935 4926 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:03 crc kubenswrapper[4926]: I0312 18:03:03.427079 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:03Z is after 2026-02-23T05:33:13Z Mar 12 18:03:03 crc kubenswrapper[4926]: W0312 18:03:03.694348 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:03Z is after 2026-02-23T05:33:13Z Mar 12 18:03:03 crc kubenswrapper[4926]: E0312 18:03:03.694483 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:04 crc kubenswrapper[4926]: I0312 18:03:04.278954 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 18:03:04 crc kubenswrapper[4926]: I0312 18:03:04.279562 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 18:03:04 crc kubenswrapper[4926]: I0312 18:03:04.425984 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:04Z is after 2026-02-23T05:33:13Z Mar 12 18:03:04 crc kubenswrapper[4926]: E0312 18:03:04.932291 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:04Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:05 crc kubenswrapper[4926]: W0312 18:03:05.157362 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:05Z is after 2026-02-23T05:33:13Z Mar 12 18:03:05 crc kubenswrapper[4926]: E0312 18:03:05.157505 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:05 crc kubenswrapper[4926]: I0312 18:03:05.425011 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:05Z is after 2026-02-23T05:33:13Z Mar 12 18:03:06 crc kubenswrapper[4926]: I0312 18:03:06.427378 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:06Z is after 2026-02-23T05:33:13Z Mar 12 18:03:07 crc kubenswrapper[4926]: I0312 18:03:07.427554 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:07Z is after 2026-02-23T05:33:13Z Mar 12 18:03:07 crc kubenswrapper[4926]: W0312 18:03:07.655666 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:07Z is after 2026-02-23T05:33:13Z Mar 12 18:03:07 crc kubenswrapper[4926]: E0312 18:03:07.655781 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:07Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.335098 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.337140 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.337220 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.337240 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.337275 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:08 crc kubenswrapper[4926]: E0312 18:03:08.339156 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:08Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 18:03:08 crc kubenswrapper[4926]: E0312 18:03:08.341147 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:08Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:08 crc kubenswrapper[4926]: I0312 18:03:08.425167 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:08Z is after 2026-02-23T05:33:13Z Mar 12 18:03:09 crc kubenswrapper[4926]: I0312 18:03:09.426354 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:09Z is after 2026-02-23T05:33:13Z Mar 12 18:03:09 crc kubenswrapper[4926]: W0312 18:03:09.824656 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:09Z is after 2026-02-23T05:33:13Z Mar 12 18:03:09 crc kubenswrapper[4926]: E0312 18:03:09.824738 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:09Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:10 crc kubenswrapper[4926]: I0312 18:03:10.424865 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:10Z is after 2026-02-23T05:33:13Z Mar 12 18:03:10 crc kubenswrapper[4926]: E0312 18:03:10.563800 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:03:11 crc kubenswrapper[4926]: I0312 18:03:11.426529 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:11Z is after 2026-02-23T05:33:13Z Mar 12 18:03:12 crc kubenswrapper[4926]: I0312 18:03:12.427402 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:12Z is after 2026-02-23T05:33:13Z Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.254774 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:46140->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.254848 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:46140->192.168.126.11:10357: read: connection reset by peer" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.254914 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.255068 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.256222 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.256274 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.256285 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.257010 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.257169 4926 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2" gracePeriod=30 Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.425109 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:13Z is after 2026-02-23T05:33:13Z Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.489228 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.490867 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.490901 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.490912 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.491462 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.652856 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.653716 4926 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2" exitCode=255 Mar 12 18:03:13 crc kubenswrapper[4926]: I0312 18:03:13.653771 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2"} Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.425272 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:14Z is after 2026-02-23T05:33:13Z Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.659418 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.661169 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97"} Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.661334 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.662148 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.662173 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.662181 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.663551 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.663909 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8"} Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.663968 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.664699 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.664743 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:14 crc kubenswrapper[4926]: I0312 18:03:14.664760 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:14 crc kubenswrapper[4926]: E0312 18:03:14.935532 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:14Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.341936 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:15 crc kubenswrapper[4926]: E0312 18:03:15.342687 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:15Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.343257 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.343301 4926 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.343316 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.343342 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:15 crc kubenswrapper[4926]: E0312 18:03:15.345798 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:15Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.425697 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:15Z is after 2026-02-23T05:33:13Z Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.668655 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.669594 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.671742 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97" exitCode=255 Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.671821 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97"} Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.671904 4926 scope.go:117] "RemoveContainer" containerID="0bb2797eb4e103e38988a8ca21a8dbc1acd77e1a462c8c15d187a78d48b0fdac" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.671940 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.672112 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.673645 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.673685 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.673715 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.674102 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.674165 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 
18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.674189 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:15 crc kubenswrapper[4926]: I0312 18:03:15.675289 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97" Mar 12 18:03:15 crc kubenswrapper[4926]: E0312 18:03:15.675697 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:16 crc kubenswrapper[4926]: I0312 18:03:16.426226 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:16Z is after 2026-02-23T05:33:13Z Mar 12 18:03:16 crc kubenswrapper[4926]: I0312 18:03:16.675854 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.428061 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:17Z is after 2026-02-23T05:33:13Z Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.572563 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.572743 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.574185 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.574215 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:17 crc kubenswrapper[4926]: I0312 18:03:17.574225 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:18 crc kubenswrapper[4926]: I0312 18:03:18.426128 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:18Z is after 2026-02-23T05:33:13Z Mar 12 18:03:19 crc kubenswrapper[4926]: I0312 18:03:19.427547 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:19Z is after 2026-02-23T05:33:13Z Mar 12 18:03:19 crc 
kubenswrapper[4926]: I0312 18:03:19.702285 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 18:03:19 crc kubenswrapper[4926]: E0312 18:03:19.708767 4926 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:19 crc kubenswrapper[4926]: E0312 18:03:19.710482 4926 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 12 18:03:20 crc kubenswrapper[4926]: I0312 18:03:20.424741 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:20Z is after 2026-02-23T05:33:13Z Mar 12 18:03:20 crc kubenswrapper[4926]: E0312 18:03:20.563991 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.229553 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.229857 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.231394 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.231467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.231486 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.232257 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97" Mar 12 18:03:21 crc kubenswrapper[4926]: E0312 18:03:21.232709 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.279047 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.279248 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.280365 4926 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.280593 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.280736 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.337857 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.427202 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:21Z is after 2026-02-23T05:33:13Z Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.691956 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.693490 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.693550 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.693567 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:21 crc kubenswrapper[4926]: I0312 18:03:21.694366 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97" Mar 12 18:03:21 crc kubenswrapper[4926]: E0312 18:03:21.694656 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.346353 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.348285 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.348348 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.348372 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.348418 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:22 crc kubenswrapper[4926]: E0312 18:03:22.348762 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:22Z is after 2026-02-23T05:33:13Z" interval="7s" 
Mar 12 18:03:22 crc kubenswrapper[4926]: E0312 18:03:22.353564 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:22Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:22 crc kubenswrapper[4926]: I0312 18:03:22.427168 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:22Z is after 2026-02-23T05:33:13Z Mar 12 18:03:22 crc kubenswrapper[4926]: W0312 18:03:22.794830 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:22Z is after 2026-02-23T05:33:13Z Mar 12 18:03:22 crc kubenswrapper[4926]: E0312 18:03:22.794909 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:23 crc kubenswrapper[4926]: I0312 18:03:23.424999 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:23Z is after 2026-02-23T05:33:13Z Mar 12 18:03:24 crc kubenswrapper[4926]: I0312 18:03:24.280042 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 18:03:24 crc kubenswrapper[4926]: I0312 18:03:24.280192 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 18:03:24 crc kubenswrapper[4926]: I0312 18:03:24.427842 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:24Z is after 2026-02-23T05:33:13Z Mar 12 18:03:24 crc kubenswrapper[4926]: W0312 18:03:24.875317 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:24Z is after 2026-02-23T05:33:13Z Mar 12 18:03:24 crc kubenswrapper[4926]: E0312 18:03:24.875395 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:24 crc kubenswrapper[4926]: E0312 18:03:24.939541 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:24Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:25 crc kubenswrapper[4926]: I0312 18:03:25.425005 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:25Z is after 2026-02-23T05:33:13Z Mar 12 18:03:26 crc kubenswrapper[4926]: I0312 18:03:26.424818 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:26Z is after 2026-02-23T05:33:13Z Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.371025 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.371224 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.372207 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.372243 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.372252 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:27 crc kubenswrapper[4926]: I0312 18:03:27.425512 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:27Z is after 2026-02-23T05:33:13Z Mar 12 18:03:28 crc kubenswrapper[4926]: I0312 18:03:28.425511 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:28Z is after 2026-02-23T05:33:13Z Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.353749 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:29 crc kubenswrapper[4926]: E0312 18:03:29.353855 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:29Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.354724 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.354757 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.354767 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.354789 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:29 crc kubenswrapper[4926]: E0312 18:03:29.357668 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:29 crc kubenswrapper[4926]: I0312 18:03:29.426603 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:29Z is after 2026-02-23T05:33:13Z Mar 12 18:03:30 crc kubenswrapper[4926]: I0312 18:03:30.426785 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:30Z is after 2026-02-23T05:33:13Z Mar 12 18:03:30 crc kubenswrapper[4926]: E0312 18:03:30.564137 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:03:31 crc kubenswrapper[4926]: I0312 18:03:31.425750 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:03:31Z is after 2026-02-23T05:33:13Z Mar 12 18:03:32 crc kubenswrapper[4926]: W0312 18:03:32.395234 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:32Z is after 2026-02-23T05:33:13Z Mar 12 18:03:32 crc kubenswrapper[4926]: E0312 18:03:32.395311 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:32 crc kubenswrapper[4926]: I0312 18:03:32.427936 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:32Z is after 2026-02-23T05:33:13Z Mar 12 18:03:33 crc kubenswrapper[4926]: I0312 18:03:33.425124 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:33Z is after 2026-02-23T05:33:13Z Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.279846 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.279906 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.425061 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:34Z is after 2026-02-23T05:33:13Z Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.489351 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.490587 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.490619 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.490630 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:34 crc kubenswrapper[4926]: I0312 18:03:34.491152 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97" Mar 12 18:03:34 crc kubenswrapper[4926]: E0312 18:03:34.491328 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:03:34 crc kubenswrapper[4926]: E0312 18:03:34.945321 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:34Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:35 crc kubenswrapper[4926]: W0312 18:03:35.402972 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:35Z is after 2026-02-23T05:33:13Z Mar 12 18:03:35 crc kubenswrapper[4926]: E0312 18:03:35.403095 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 18:03:35 crc kubenswrapper[4926]: I0312 18:03:35.425190 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:35Z is after 2026-02-23T05:33:13Z Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.358166 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.359581 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.359644 4926 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.359669 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.359711 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:36 crc kubenswrapper[4926]: E0312 18:03:36.361958 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 18:03:36 crc kubenswrapper[4926]: E0312 18:03:36.364407 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 18:03:36 crc kubenswrapper[4926]: I0312 18:03:36.424625 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:36Z is after 2026-02-23T05:33:13Z Mar 12 18:03:37 crc kubenswrapper[4926]: I0312 18:03:37.424899 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:37Z is after 2026-02-23T05:33:13Z Mar 12 18:03:38 crc kubenswrapper[4926]: I0312 18:03:38.426981 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:38Z is after 2026-02-23T05:33:13Z Mar 12 18:03:39 crc kubenswrapper[4926]: I0312 18:03:39.425545 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:39Z is after 2026-02-23T05:33:13Z Mar 12 18:03:40 crc kubenswrapper[4926]: I0312 18:03:40.426184 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:03:40Z is after 2026-02-23T05:33:13Z Mar 12 18:03:40 crc kubenswrapper[4926]: E0312 18:03:40.564287 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:03:41 crc kubenswrapper[4926]: I0312 18:03:41.429325 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 18:03:42 crc kubenswrapper[4926]: I0312 18:03:42.428888 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.364725 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.367503 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.367619 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.367642 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.367690 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 18:03:43 crc kubenswrapper[4926]: E0312 18:03:43.369321 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 18:03:43 crc kubenswrapper[4926]: E0312 18:03:43.370149 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 18:03:43 crc kubenswrapper[4926]: I0312 18:03:43.431671 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.278542 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.278686 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.278773 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.278968 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.280242 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 
18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.280291 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.280308 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.280949 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.281099 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8" gracePeriod=30 Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.428653 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.750201 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.751370 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.751811 4926 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8" exitCode=255 Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.751854 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8"} Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.751880 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"405f59d0da6c9a3663ed746f08f9d5c2d94818971dbc0ce0373690c731b5afae"} Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.751898 4926 scope.go:117] "RemoveContainer" containerID="4557fcde61f57116f59d3965fc85f08395545a7008311738e24afc920c4fbde2" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.752003 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.752933 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.752988 4926 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:44 crc kubenswrapper[4926]: I0312 18:03:44.753005 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.949381 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06703cc5e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,LastTimestamp:2026-03-12 18:02:40.418334179 +0000 UTC m=+0.786960552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.954571 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.960461 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.964034 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 
18:02:40.474022262 +0000 UTC m=+0.842648595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.967558 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a067856139d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.554210205 +0000 UTC m=+0.922836558,LastTimestamp:2026-03-12 18:02:40.554210205 +0000 UTC m=+0.922836558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.971675 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.591277422 +0000 UTC m=+0.959903755,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.975104 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.591295923 +0000 UTC m=+0.959922256,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.980649 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.591325114 +0000 UTC m=+0.959951447,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.986428 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.592684351 +0000 UTC m=+0.961310684,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.989840 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.592703172 +0000 UTC m=+0.961329495,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.993327 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.592713192 +0000 UTC m=+0.961339525,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:44 crc kubenswrapper[4926]: E0312 18:03:44.998549 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.593225781 +0000 UTC m=+0.961852114,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.002382 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.593236761 +0000 UTC m=+0.961863094,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.005735 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.593247111 +0000 UTC m=+0.961873444,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.009822 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.593868664 +0000 UTC m=+0.962495007,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.014508 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.593885395 +0000 UTC m=+0.962511728,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.023359 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.593896115 +0000 UTC m=+0.962522448,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.029380 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.594219377 +0000 UTC m=+0.962845710,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.035891 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.594248538 +0000 UTC m=+0.962874871,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.042105 4926 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.59428994 +0000 UTC m=+0.962916273,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.046295 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.594938093 +0000 UTC m=+0.963564426,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.050529 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.594956754 +0000 UTC m=+0.963583087,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.054501 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e8176\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e8176 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474022262 +0000 UTC m=+0.842648595,LastTimestamp:2026-03-12 18:02:40.594966864 +0000 UTC m=+0.963593187,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc 
kubenswrapper[4926]: E0312 18:03:45.058127 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e238d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e238d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.473998221 +0000 UTC m=+0.842624554,LastTimestamp:2026-03-12 18:02:40.595792374 +0000 UTC m=+0.964418707,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.062940 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c2a06738e5dfe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c2a06738e5dfe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.474013182 +0000 UTC m=+0.842639515,LastTimestamp:2026-03-12 18:02:40.595807374 +0000 UTC m=+0.964433707,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.068804 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a0691e1fd66 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:40.982809958 +0000 UTC m=+1.351436291,LastTimestamp:2026-03-12 18:02:40.982809958 +0000 UTC m=+1.351436291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.073508 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a0692e9e595 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.000105365 +0000 UTC m=+1.368731758,LastTimestamp:2026-03-12 18:02:41.000105365 +0000 UTC m=+1.368731758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.078421 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0692fefd8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.001487755 +0000 UTC m=+1.370114088,LastTimestamp:2026-03-12 18:02:41.001487755 +0000 UTC m=+1.370114088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.083508 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06949e358c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.028699532 +0000 UTC m=+1.397325865,LastTimestamp:2026-03-12 18:02:41.028699532 +0000 UTC m=+1.397325865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.088126 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06961efa2f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.053915695 +0000 UTC m=+1.422542028,LastTimestamp:2026-03-12 18:02:41.053915695 +0000 UTC m=+1.422542028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.092971 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06b3bba266 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.550721638 +0000 UTC m=+1.919347971,LastTimestamp:2026-03-12 18:02:41.550721638 +0000 UTC m=+1.919347971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.098840 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06b3d4273a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.552328506 +0000 UTC m=+1.920954839,LastTimestamp:2026-03-12 18:02:41.552328506 +0000 UTC m=+1.920954839,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.102790 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06b440585d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.559418973 +0000 UTC m=+1.928045306,LastTimestamp:2026-03-12 18:02:41.559418973 +0000 UTC m=+1.928045306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.107469 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a06b4c723a1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.568252833 +0000 UTC m=+1.936879166,LastTimestamp:2026-03-12 18:02:41.568252833 +0000 UTC m=+1.936879166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.109802 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06b4d11a59 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.568905817 +0000 UTC m=+1.937532160,LastTimestamp:2026-03-12 18:02:41.568905817 +0000 UTC m=+1.937532160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.111828 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a06b4d16a5a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.568926298 +0000 UTC m=+1.937552641,LastTimestamp:2026-03-12 18:02:41.568926298 +0000 UTC 
m=+1.937552641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.114547 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06b4db8c64 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.569590372 +0000 UTC m=+1.938216725,LastTimestamp:2026-03-12 18:02:41.569590372 +0000 UTC m=+1.938216725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.116123 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06b4e37904 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.5701097 +0000 UTC m=+1.938736043,LastTimestamp:2026-03-12 18:02:41.5701097 +0000 UTC m=+1.938736043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.118152 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a06b5e85e1e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.58720771 +0000 UTC m=+1.955834043,LastTimestamp:2026-03-12 18:02:41.58720771 +0000 UTC m=+1.955834043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.120812 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a06b5fcc745 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.588545349 +0000 UTC m=+1.957171682,LastTimestamp:2026-03-12 18:02:41.588545349 +0000 UTC m=+1.957171682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.121988 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06b6035769 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.588975465 +0000 UTC m=+1.957601788,LastTimestamp:2026-03-12 18:02:41.588975465 +0000 UTC m=+1.957601788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.126137 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06c74187ff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.878263807 +0000 UTC m=+2.246890180,LastTimestamp:2026-03-12 18:02:41.878263807 +0000 UTC m=+2.246890180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.130185 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06c7ece494 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.889494164 +0000 UTC m=+2.258120527,LastTimestamp:2026-03-12 18:02:41.889494164 +0000 UTC m=+2.258120527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.134269 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06c801c6ac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.890862764 +0000 UTC m=+2.259489137,LastTimestamp:2026-03-12 18:02:41.890862764 +0000 UTC m=+2.259489137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.138754 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06d60c379c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.12642806 +0000 UTC m=+2.495054433,LastTimestamp:2026-03-12 18:02:42.12642806 +0000 UTC m=+2.495054433,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.143148 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06d6eec9e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.141276648 +0000 UTC m=+2.509903021,LastTimestamp:2026-03-12 18:02:42.141276648 +0000 UTC m=+2.509903021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.146760 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06d701b48b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.142516363 +0000 UTC m=+2.511142736,LastTimestamp:2026-03-12 18:02:42.142516363 +0000 UTC m=+2.511142736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.149937 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06e1a8997d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.321226109 +0000 UTC m=+2.689852452,LastTimestamp:2026-03-12 18:02:42.321226109 +0000 UTC m=+2.689852452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.153201 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06e240ddf6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.33120511 +0000 UTC m=+2.699831463,LastTimestamp:2026-03-12 18:02:42.33120511 +0000 UTC m=+2.699831463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.157887 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06ecb30c3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.506460221 +0000 UTC m=+2.875086554,LastTimestamp:2026-03-12 18:02:42.506460221 +0000 UTC m=+2.875086554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.162595 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a06eced1d3c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.51026566 +0000 UTC m=+2.878891993,LastTimestamp:2026-03-12 18:02:42.51026566 +0000 UTC m=+2.878891993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.166548 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a06ed2f3e2a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.514599466 +0000 UTC m=+2.883225829,LastTimestamp:2026-03-12 18:02:42.514599466 +0000 UTC m=+2.883225829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.171550 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06ed4c2a93 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.516494995 +0000 UTC m=+2.885121328,LastTimestamp:2026-03-12 18:02:42.516494995 +0000 UTC m=+2.885121328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.175308 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06f9cb7081 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.726162561 +0000 UTC m=+3.094788894,LastTimestamp:2026-03-12 18:02:42.726162561 +0000 UTC m=+3.094788894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.179043 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06f9e98f7b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.728136571 +0000 UTC m=+3.096762904,LastTimestamp:2026-03-12 18:02:42.728136571 +0000 UTC m=+3.096762904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.182254 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a06f9ea0fc5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.728169413 +0000 UTC m=+3.096795756,LastTimestamp:2026-03-12 18:02:42.728169413 +0000 UTC m=+3.096795756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.185752 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a06fa34bc77 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.733063287 +0000 UTC m=+3.101689620,LastTimestamp:2026-03-12 18:02:42.733063287 +0000 UTC m=+3.101689620,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.191067 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06fad6380f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.743646223 +0000 UTC m=+3.112272556,LastTimestamp:2026-03-12 18:02:42.743646223 +0000 UTC 
m=+3.112272556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.194582 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a06fae60544 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.744681796 +0000 UTC m=+3.113308129,LastTimestamp:2026-03-12 18:02:42.744681796 +0000 UTC m=+3.113308129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.198814 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06fafcfc5e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.746186846 +0000 UTC m=+3.114813199,LastTimestamp:2026-03-12 18:02:42.746186846 +0000 UTC m=+3.114813199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.202292 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a06fb140298 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.747695768 +0000 UTC m=+3.116322111,LastTimestamp:2026-03-12 18:02:42.747695768 +0000 UTC m=+3.116322111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 
18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.206891 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a06fb8e1d27 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.755697959 +0000 UTC m=+3.124324282,LastTimestamp:2026-03-12 18:02:42.755697959 +0000 UTC m=+3.124324282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.210329 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c2a06fbc62132 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.75936901 +0000 UTC m=+3.127995343,LastTimestamp:2026-03-12 18:02:42.75936901 +0000 UTC m=+3.127995343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.212276 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a0705d2c206 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.927968774 +0000 UTC m=+3.296595107,LastTimestamp:2026-03-12 18:02:42.927968774 +0000 UTC m=+3.296595107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.214022 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a0705d36709 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.928011017 +0000 UTC m=+3.296637350,LastTimestamp:2026-03-12 18:02:42.928011017 +0000 UTC m=+3.296637350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.216433 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a070688522a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.93986769 +0000 UTC m=+3.308494023,LastTimestamp:2026-03-12 18:02:42.93986769 +0000 UTC m=+3.308494023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.218353 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a0706992809 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.940971017 +0000 UTC m=+3.309597370,LastTimestamp:2026-03-12 18:02:42.940971017 +0000 UTC m=+3.309597370,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.222206 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a0706aa32c2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started 
container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.942087874 +0000 UTC m=+3.310714197,LastTimestamp:2026-03-12 18:02:42.942087874 +0000 UTC m=+3.310714197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.227015 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a0706b30627 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:42.942666279 +0000 UTC m=+3.311292632,LastTimestamp:2026-03-12 18:02:42.942666279 +0000 UTC m=+3.311292632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.231237 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a07161f395e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.201415518 +0000 UTC m=+3.570041871,LastTimestamp:2026-03-12 18:02:43.201415518 +0000 UTC m=+3.570041871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.235997 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a07163fde1f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.203554847 +0000 UTC m=+3.572181190,LastTimestamp:2026-03-12 18:02:43.203554847 +0000 UTC 
m=+3.572181190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.240606 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c2a0716c78a36 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.212446262 +0000 UTC m=+3.581072595,LastTimestamp:2026-03-12 18:02:43.212446262 +0000 UTC m=+3.581072595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.244432 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a07172c68e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.219056869 +0000 UTC m=+3.587683202,LastTimestamp:2026-03-12 18:02:43.219056869 +0000 UTC m=+3.587683202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.251785 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a07173d3762 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.220158306 +0000 UTC m=+3.588784639,LastTimestamp:2026-03-12 18:02:43.220158306 +0000 UTC m=+3.588784639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.256818 4926 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a07221403f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.402007538 +0000 UTC m=+3.770633871,LastTimestamp:2026-03-12 18:02:43.402007538 +0000 UTC m=+3.770633871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.260649 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a0722c5eb63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.413666659 +0000 UTC m=+3.782293012,LastTimestamp:2026-03-12 18:02:43.413666659 +0000 UTC m=+3.782293012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.265933 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a0722df5fb7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.415334839 +0000 UTC m=+3.783961192,LastTimestamp:2026-03-12 18:02:43.415334839 +0000 UTC m=+3.783961192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.271272 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0729b6b165 openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.530109285 +0000 UTC m=+3.898735618,LastTimestamp:2026-03-12 18:02:43.530109285 +0000 UTC m=+3.898735618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.276309 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a072dc99b9e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.598457758 +0000 UTC m=+3.967084101,LastTimestamp:2026-03-12 18:02:43.598457758 +0000 UTC m=+3.967084101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.280620 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a072e7bd965 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.610138981 +0000 UTC m=+3.978765324,LastTimestamp:2026-03-12 18:02:43.610138981 +0000 UTC m=+3.978765324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.284970 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a07354dde28 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.724566056 +0000 UTC m=+4.093192389,LastTimestamp:2026-03-12 18:02:43.724566056 +0000 UTC m=+4.093192389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.289931 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0735fba04a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.735953482 +0000 UTC m=+4.104579815,LastTimestamp:2026-03-12 18:02:43.735953482 +0000 UTC m=+4.104579815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.294614 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0766053795 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:44.541888405 +0000 UTC m=+4.910514738,LastTimestamp:2026-03-12 18:02:44.541888405 +0000 UTC m=+4.910514738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.299327 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0775962516 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:44.80304463 +0000 UTC m=+5.171670963,LastTimestamp:2026-03-12 18:02:44.80304463 +0000 UTC m=+5.171670963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.303907 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a077666ba34 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:44.816714292 +0000 UTC m=+5.185340625,LastTimestamp:2026-03-12 18:02:44.816714292 +0000 UTC m=+5.185340625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.308762 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0776763400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:44.817728512 +0000 UTC m=+5.186354835,LastTimestamp:2026-03-12 18:02:44.817728512 +0000 UTC m=+5.186354835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.313534 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0783c56a8d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.041023629 +0000 UTC m=+5.409649962,LastTimestamp:2026-03-12 18:02:45.041023629 +0000 UTC m=+5.409649962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.316014 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0784e860c0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.060092096 +0000 UTC m=+5.428718429,LastTimestamp:2026-03-12 
18:02:45.060092096 +0000 UTC m=+5.428718429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.319583 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0784fd7f71 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.061476209 +0000 UTC m=+5.430102542,LastTimestamp:2026-03-12 18:02:45.061476209 +0000 UTC m=+5.430102542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.323790 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a0792a76c66 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.290716262 +0000 UTC m=+5.659342595,LastTimestamp:2026-03-12 18:02:45.290716262 +0000 UTC m=+5.659342595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.328228 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a079330c71a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.299717914 +0000 UTC m=+5.668344247,LastTimestamp:2026-03-12 18:02:45.299717914 +0000 UTC m=+5.668344247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.332930 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a07933ec74e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.30063547 +0000 UTC m=+5.669261803,LastTimestamp:2026-03-12 18:02:45.30063547 +0000 UTC m=+5.669261803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.336637 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a079e6cf416 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.488210966 +0000 UTC m=+5.856837309,LastTimestamp:2026-03-12 18:02:45.488210966 +0000 UTC m=+5.856837309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.340683 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a079efe4905 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.497735429 +0000 UTC m=+5.866361752,LastTimestamp:2026-03-12 18:02:45.497735429 +0000 UTC m=+5.866361752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.344671 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a079f0b4642 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.49858669 +0000 UTC m=+5.867213023,LastTimestamp:2026-03-12 18:02:45.49858669 +0000 UTC m=+5.867213023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.349144 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a07a96a1379 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.672571769 +0000 UTC m=+6.041198092,LastTimestamp:2026-03-12 18:02:45.672571769 +0000 UTC m=+6.041198092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.353328 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c2a07aa444dce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:45.68687355 +0000 UTC m=+6.055499883,LastTimestamp:2026-03-12 18:02:45.68687355 +0000 UTC m=+6.055499883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.359186 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6e4922 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 18:03:45 crc kubenswrapper[4926]: body:
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.279559458 +0000 UTC m=+14.648185801,LastTimestamp:2026-03-12 18:02:54.279559458 +0000 UTC m=+14.648185801,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.362920 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6f404a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.27962273 +0000 UTC m=+14.648249073,LastTimestamp:2026-03-12 18:02:54.27962273 +0000 UTC m=+14.648249073,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.367260 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c2a0722df5fb7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a0722df5fb7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.415334839 +0000 UTC m=+3.783961192,LastTimestamp:2026-03-12 18:02:54.585949675 +0000 UTC m=+14.954576048,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.371419 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c2a072dc99b9e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a072dc99b9e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.598457758 +0000 UTC m=+3.967084101,LastTimestamp:2026-03-12 18:02:54.842625546 +0000 UTC m=+15.211251879,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.375663 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c2a072e7bd965\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a072e7bd965 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:43.610138981 +0000 UTC m=+3.978765324,LastTimestamp:2026-03-12 18:02:54.855078029 +0000 UTC m=+15.223704382,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.380564 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-apiserver-crc.189c2a09d19b0151 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 12 18:03:45 crc kubenswrapper[4926]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 18:03:45 crc kubenswrapper[4926]: 
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.936801617 +0000 UTC m=+15.305427960,LastTimestamp:2026-03-12 18:02:54.936801617 +0000 UTC m=+15.305427960,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.384332 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a09d19c026f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.936867439 +0000 UTC m=+15.305493772,LastTimestamp:2026-03-12 18:02:54.936867439 +0000 UTC m=+15.305493772,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.389040 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c2a09d19b0151\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-apiserver-crc.189c2a09d19b0151 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 12 18:03:45 crc kubenswrapper[4926]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 12 18:03:45 crc kubenswrapper[4926]: 
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.936801617 +0000 UTC m=+15.305427960,LastTimestamp:2026-03-12 18:02:54.943599313 +0000 UTC m=+15.312225656,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.393957 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c2a09d19c026f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c2a09d19c026f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.936867439 +0000 UTC m=+15.305493772,LastTimestamp:2026-03-12 18:02:54.943653735 +0000 UTC m=+15.312280068,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.400910 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a09aa6e4922\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6e4922 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 18:03:45 crc kubenswrapper[4926]: body:
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.279559458 +0000 UTC m=+14.648185801,LastTimestamp:2026-03-12 18:03:04.279514677 +0000 UTC m=+24.648141010,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.405596 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a09aa6f404a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6f404a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.27962273 +0000 UTC m=+14.648249073,LastTimestamp:2026-03-12 18:03:04.279782097 +0000 UTC m=+24.648408430,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.409509 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-controller-manager-crc.189c2a0e1571ef35 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:46140->192.168.126.11:10357: read: connection reset by peer
Mar 12 18:03:45 crc kubenswrapper[4926]: body:
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:03:13.254829877 +0000 UTC m=+33.623456220,LastTimestamp:2026-03-12 18:03:13.254829877 +0000 UTC m=+33.623456220,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.412903 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a0e1572babb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:46140->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:03:13.254881979 +0000 UTC m=+33.623508312,LastTimestamp:2026-03-12 18:03:13.254881979 +0000 UTC m=+33.623508312,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.418317 4926 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a0e15954f65 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:03:13.257148261 +0000 UTC m=+33.625774594,LastTimestamp:2026-03-12 18:03:13.257148261 +0000 UTC m=+33.625774594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.420542 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a06b4e37904\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06b4e37904 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.5701097 +0000 UTC m=+1.938736043,LastTimestamp:2026-03-12 18:03:13.775076741 +0000 UTC m=+34.143703074,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.422406 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.422561 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a06c74187ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06c74187ff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.878263807 +0000 UTC m=+2.246890180,LastTimestamp:2026-03-12 18:03:13.982368969 +0000 UTC m=+34.350995342,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.424164 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a06c7ece494\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a06c7ece494 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:41.889494164 +0000 UTC m=+2.258120527,LastTimestamp:2026-03-12 18:03:13.994832701 +0000 UTC m=+34.363459084,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.426943 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a09aa6e4922\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6e4922 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 18:03:45 crc kubenswrapper[4926]: body:
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.279559458 +0000 UTC m=+14.648185801,LastTimestamp:2026-03-12 18:03:24.280145796 +0000 UTC m=+44.648772219,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.430721 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a09aa6f404a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6f404a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.27962273 +0000 UTC m=+14.648249073,LastTimestamp:2026-03-12 18:03:24.280240098 +0000 UTC m=+44.648866471,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 12 18:03:45 crc kubenswrapper[4926]: E0312 18:03:45.435362 4926 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c2a09aa6e4922\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 12 18:03:45 crc kubenswrapper[4926]: &Event{ObjectMeta:{kube-controller-manager-crc.189c2a09aa6e4922 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 12 18:03:45 crc kubenswrapper[4926]: body:
Mar 12 18:03:45 crc kubenswrapper[4926]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:02:54.279559458 +0000 UTC m=+14.648185801,LastTimestamp:2026-03-12 18:03:34.279890169 +0000 UTC m=+54.648516512,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 12 18:03:45 crc kubenswrapper[4926]: >
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.758110 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.760110 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.761205 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.761290 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:45 crc kubenswrapper[4926]: I0312 18:03:45.761317 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:46 crc kubenswrapper[4926]: I0312 18:03:46.427311 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.429016 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.489650 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.490985 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.491020 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.491030 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.491672 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.573218 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.573423 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.574754 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.574790 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.574801 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.766807 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.768330 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab"}
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.768493 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.769429 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.769476 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:47 crc kubenswrapper[4926]: I0312 18:03:47.769487 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:48 crc kubenswrapper[4926]: I0312 18:03:48.426919 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.427295 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.777504 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.778761 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.780867 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab" exitCode=255
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.780914 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab"}
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.780957 4926 scope.go:117] "RemoveContainer" containerID="237a0c1a00f49cb1e15478c5454fa052180c11a579cce5309cb7213bd78dff97"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.781063 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.781816 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.781839 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.781848 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:49 crc kubenswrapper[4926]: I0312 18:03:49.782305 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab"
Mar 12 18:03:49 crc kubenswrapper[4926]: E0312 18:03:49.782487 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.370007 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.371982 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.372259 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.372351 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.372478 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:03:50 crc kubenswrapper[4926]: E0312 18:03:50.377894 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 18:03:50 crc kubenswrapper[4926]: E0312 18:03:50.378942 4926 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.427379 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:50 crc kubenswrapper[4926]: E0312 18:03:50.565263 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 12 18:03:50 crc kubenswrapper[4926]: I0312 18:03:50.784885 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.229156 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.229407 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.230585 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.230617 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.230627 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.231081 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab"
Mar 12 18:03:51 crc kubenswrapper[4926]: E0312 18:03:51.231244 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.279852 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.280070 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.281088 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.281115 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.281128 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.329244 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.337296 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.426508 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.712682 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.741361 4926 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.788858 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.788901 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790095 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790118 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790126 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790188 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790204 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790214 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:51 crc kubenswrapper[4926]: I0312 18:03:51.790741 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab"
Mar 12 18:03:51 crc kubenswrapper[4926]: E0312 18:03:51.790909 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 12 18:03:52 crc kubenswrapper[4926]: I0312 18:03:52.427316 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:53 crc kubenswrapper[4926]: I0312 18:03:53.426735 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:54 crc kubenswrapper[4926]: I0312 18:03:54.428152 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:55 crc kubenswrapper[4926]: I0312 18:03:55.429840 4926 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:55 crc kubenswrapper[4926]: W0312 18:03:55.711958 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 12 18:03:55 crc kubenswrapper[4926]: E0312 18:03:55.712029 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 12 18:03:55 crc kubenswrapper[4926]: W0312 18:03:55.880685 4926 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 12 18:03:55 crc kubenswrapper[4926]: E0312 18:03:55.880741 4926 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 12 18:03:56 crc kubenswrapper[4926]: I0312 18:03:56.025652 4926 csr.go:261] certificate signing request csr-f7zxj is approved, waiting to be issued
Mar 12 18:03:56 crc kubenswrapper[4926]: I0312 18:03:56.037176 4926 csr.go:257] certificate signing request csr-f7zxj is issued
Mar 12 18:03:56 crc kubenswrapper[4926]: I0312 18:03:56.051921 4926 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 12 18:03:56 crc kubenswrapper[4926]: I0312 18:03:56.250761 4926 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.038531 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-19 20:51:38.749459399 +0000 UTC
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.038590 4926 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7514h47m41.710878482s for next certificate rotation
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.379648 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.380842 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.380891 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.380901 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.380989 4926 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.389591 4926 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.389914 4926 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.389949 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.396589 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.396627 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.396636 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.396650 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.396659 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:03:57Z","lastTransitionTime":"2026-03-12T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.414907 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.425912 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.425955 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.425965 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.425979 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.425990 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:03:57Z","lastTransitionTime":"2026-03-12T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.436768 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.445901 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.445956 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.445970 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.445990 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.446004 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:03:57Z","lastTransitionTime":"2026-03-12T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.468299 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.468346 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.468356 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.468372 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.468383 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:03:57Z","lastTransitionTime":"2026-03-12T18:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.484044 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.484104 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.576308 4926 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.576532 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.577840 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.577908 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:03:57 crc kubenswrapper[4926]: I0312 18:03:57.577926 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.584427 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.684810 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.785359 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.886527 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:57 crc kubenswrapper[4926]: E0312 18:03:57.987644 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.088073 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.188777 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.289497 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.390002 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.490787 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.591885 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.692628 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.793598 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.894270 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:58 crc kubenswrapper[4926]: E0312 18:03:58.995150 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.095722 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc 
kubenswrapper[4926]: E0312 18:03:59.195863 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.296380 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.397559 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.497702 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.599255 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.699498 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.800246 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:03:59 crc kubenswrapper[4926]: E0312 18:03:59.901208 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.001830 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.102937 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.203931 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.304594 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.405049 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.505720 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.565732 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.606153 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.706977 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.807866 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:00 crc kubenswrapper[4926]: E0312 18:04:00.908604 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.008911 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.110002 4926 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.211095 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.311628 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.411755 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.512761 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.613029 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.713933 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.814356 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:01 crc kubenswrapper[4926]: E0312 18:04:01.915413 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.016349 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.117302 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.218225 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.318635 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.418814 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.519228 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.619341 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.720319 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.821065 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:02 crc kubenswrapper[4926]: E0312 18:04:02.921867 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.022309 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.123401 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.223927 4926 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.324694 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.425129 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.525326 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.625764 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.725953 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.827154 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:03 crc kubenswrapper[4926]: E0312 18:04:03.928304 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.029525 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.129806 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.230389 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.331533 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.431687 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: I0312 18:04:04.489612 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:04:04 crc kubenswrapper[4926]: I0312 18:04:04.490828 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:04 crc kubenswrapper[4926]: I0312 18:04:04.490860 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:04 crc kubenswrapper[4926]: I0312 18:04:04.490868 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:04 crc kubenswrapper[4926]: I0312 18:04:04.491373 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.491552 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.532006 4926 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.632120 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.732262 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.832386 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:04 crc kubenswrapper[4926]: E0312 18:04:04.933713 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.034493 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.135542 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.236785 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.337883 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.438045 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.538967 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.639432 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.740478 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.841523 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:05 crc kubenswrapper[4926]: E0312 18:04:05.942680 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.043362 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.143683 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.244712 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.344946 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.445661 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.546335 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.646543 4926 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.747104 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.848211 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:06 crc kubenswrapper[4926]: E0312 18:04:06.949245 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.050144 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.151173 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.251347 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.351844 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.452518 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.552638 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.653329 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.753795 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.854195 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.873573 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.879875 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.879988 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.880017 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.880057 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.880086 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:07Z","lastTransitionTime":"2026-03-12T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.898876 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.910614 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.910865 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.910941 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.911036 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.911107 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:07Z","lastTransitionTime":"2026-03-12T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.922196 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.933768 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.933935 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.934040 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.934086 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.934123 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:07Z","lastTransitionTime":"2026-03-12T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.951276 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.962949 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.962994 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.963005 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.963030 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:07 crc kubenswrapper[4926]: I0312 18:04:07.963041 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:07Z","lastTransitionTime":"2026-03-12T18:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.975038 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.975179 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:04:07 crc kubenswrapper[4926]: E0312 18:04:07.975207 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.075725 4926 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.176643 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.276754 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.377855 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.478213 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.578748 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.678848 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.779597 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.880014 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:08 crc kubenswrapper[4926]: E0312 18:04:08.980770 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.081488 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.182339 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.283512 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.384587 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.485777 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.586992 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.687902 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.789103 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.889697 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:09 crc kubenswrapper[4926]: E0312 18:04:09.990805 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.091969 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 
18:04:10.192171 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.292818 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.393521 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: I0312 18:04:10.489386 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:04:10 crc kubenswrapper[4926]: I0312 18:04:10.491244 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:10 crc kubenswrapper[4926]: I0312 18:04:10.491318 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:10 crc kubenswrapper[4926]: I0312 18:04:10.491341 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.494376 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.566160 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.594551 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.695057 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.795563 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.896535 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:10 crc kubenswrapper[4926]: E0312 18:04:10.997692 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.098776 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.199947 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.300260 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.400950 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.502045 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.603013 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.704224 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.805607 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:11 crc kubenswrapper[4926]: E0312 18:04:11.906759 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.007519 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.108104 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.209017 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.309396 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.410338 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.510456 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.610768 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.711869 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.812619 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:12 crc kubenswrapper[4926]: E0312 18:04:12.913726 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.014846 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.115929 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.217124 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.318211 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.418683 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.519500 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.619640 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.720699 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.821052 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 12 18:04:13 crc kubenswrapper[4926]: E0312 18:04:13.921665 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.022350 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.123306 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.224229 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.324595 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.424723 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.524825 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.625369 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.725729 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.826194 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:14 crc kubenswrapper[4926]: E0312 18:04:14.926904 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.028026 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.128504 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.229085 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.329519 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.430168 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.530360 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.631038 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.731998 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.832345 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:15 crc kubenswrapper[4926]: E0312 18:04:15.933332 4926 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.034311 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.135505 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.235588 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.336087 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.436592 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.489881 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.491024 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.491083 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.491096 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.491759 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.491926 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 18:04:16 crc kubenswrapper[4926]: I0312 18:04:16.514585 4926 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.537341 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.637760 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.738222 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.839378 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:16 crc kubenswrapper[4926]: E0312 18:04:16.939946 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.040820 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.141979 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.242307 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.342876 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.443682 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.544489 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.644589 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.744895 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.845605 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:17 crc kubenswrapper[4926]: E0312 18:04:17.946759 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.047907 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.148383 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.248527 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.349018 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.350152 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.354510 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.354575 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.354584 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.354600 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.354609 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:18Z","lastTransitionTime":"2026-03-12T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.371074 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.375161 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.375211 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.375222 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.375240 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.375255 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:18Z","lastTransitionTime":"2026-03-12T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.389503 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.394099 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.394145 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.394179 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.394197 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.394207 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:18Z","lastTransitionTime":"2026-03-12T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.404101 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.408481 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.408543 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.408555 4926 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.408570 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:18 crc kubenswrapper[4926]: I0312 18:04:18.408579 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:18Z","lastTransitionTime":"2026-03-12T18:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.418834 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.418945 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.449150 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.549609 4926 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.650108 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.750254 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.850839 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:18 crc kubenswrapper[4926]: E0312 18:04:18.951320 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.052297 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.152501 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.253403 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.353751 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.454586 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.555238 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.656413 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.757402 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.857821 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:19 crc kubenswrapper[4926]: E0312 18:04:19.958756 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.059120 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.160019 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.260324 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.361000 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.461155 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.562136 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 
18:04:20.566304 4926 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.663240 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.763583 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.864802 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:20 crc kubenswrapper[4926]: E0312 18:04:20.965237 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.065555 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.166570 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.267648 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.368748 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.469058 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.570035 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.670392 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.771232 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.871708 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:21 crc kubenswrapper[4926]: E0312 18:04:21.972241 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.072427 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.173093 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.273506 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.373970 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.474390 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.574749 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 
18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.675856 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.776895 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.877773 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:22 crc kubenswrapper[4926]: E0312 18:04:22.978091 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.078307 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.179303 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.280275 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.380940 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.482113 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.582635 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.683762 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.784044 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.884667 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:23 crc kubenswrapper[4926]: E0312 18:04:23.984826 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.085839 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.186111 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.287165 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.387664 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.488864 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.589542 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.690104 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.790596 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.891363 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:24 crc kubenswrapper[4926]: E0312 18:04:24.991964 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.092691 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.193650 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.294759 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.395582 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: I0312 18:04:25.489904 4926 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 18:04:25 crc kubenswrapper[4926]: I0312 18:04:25.493990 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:25 crc kubenswrapper[4926]: I0312 18:04:25.494033 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:25 crc kubenswrapper[4926]: I0312 18:04:25.494046 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.496349 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.596676 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.697115 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.797808 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.898702 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:25 crc kubenswrapper[4926]: E0312 18:04:25.999801 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.100653 4926 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.177151 4926 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.203941 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.203982 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.203992 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.204008 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.204020 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.306712 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.306776 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.306789 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.306805 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.306818 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.408887 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.408930 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.408941 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.408959 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.408971 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.440880 4926 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.466162 4926 apiserver.go:52] "Watching apiserver" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.470733 4926 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471116 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-machine-config-operator/machine-config-daemon-hmdg8","openshift-multus/multus-xwqvl","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-dns/node-resolver-f9vxh","openshift-multus/multus-additional-cni-plugins-srh42","openshift-ovn-kubernetes/ovnkube-node-zlfmg"] Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471547 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471555 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471605 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.471667 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471679 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.471721 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.471738 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.472386 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.472571 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.472484 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.472954 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.473322 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.473936 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.474102 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.476888 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.476891 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477126 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477230 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477321 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477376 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477516 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477956 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.477984 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478117 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478138 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478218 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478241 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478316 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478336 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478428 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478578 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478666 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478684 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.478722 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479054 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479088 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479058 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479139 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479166 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479199 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479267 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479272 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479366 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.479386 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 18:04:26 crc 
kubenswrapper[4926]: I0312 18:04:26.479488 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.501121 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.512026 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.512076 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.512087 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.512104 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.512115 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.514109 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.524140 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.530719 4926 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.537619 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.549003 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.557892 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.567129 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572372 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572538 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572667 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572805 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572957 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573091 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573198 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573291 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573383 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573515 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572714 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573763 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573703 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573814 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573826 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.572989 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573850 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573872 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573225 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573896 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573922 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573944 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573976 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574001 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574018 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574035 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574056 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574079 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574098 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573182 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573400 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573617 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573626 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.573654 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574090 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574118 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574245 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574291 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574242 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574342 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574411 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574448 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574456 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574480 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574506 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574531 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574554 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574575 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574596 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574619 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574728 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574780 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574804 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574826 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574849 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574875 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574898 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574921 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574943 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574965 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574989 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575011 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575036 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575058 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575080 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575103 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575126 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575147 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575169 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575190 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575211 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575233 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575255 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575277 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575299 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575321 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575345 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575366 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575388 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575414 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575439 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575502 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575525 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575546 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575567 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575589 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575640 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575661 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575683 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575705 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575728 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575750 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575772 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575796 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575818 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575843 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575869 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575894 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575916 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575938 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575960 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575981 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576004 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576028 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576051 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576074 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576096 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576118 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576141 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576161 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576184 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576222 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576245 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576267 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576297 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576337 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576379 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576408 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576446 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576502 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576538 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576567 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576589 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576613 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576638 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576684 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576712 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576745 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576774 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576806 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576833 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576861 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576901 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576950 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576996 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577048 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577101 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577136 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577169 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577202 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577235 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577268 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577301 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577337 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577374 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577419 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577540 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577593 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577644 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577678 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577718 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577753 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577789 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577827 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577878 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577928 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577966 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578001 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578034 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578072 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578108 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578140 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578171 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578205 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578555 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578604 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578642 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578695 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578770 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578759 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578824 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578860 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578894 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578927 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578960 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578995 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579056 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 
18:04:26.579122 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579157 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579189 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579223 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579258 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579292 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579330 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579368 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579403 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579443 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579531 
4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579570 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579605 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579640 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579675 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579718 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579787 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579823 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579858 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579892 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579926 4926 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579961 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579997 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580033 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580068 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580103 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580137 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580172 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580211 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580247 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 
18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580285 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580324 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580359 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580438 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cni-binary-copy\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580505 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-etc-kubernetes\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580541 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580574 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580609 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580643 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cnibin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580675 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-k8s-cni-cncf-io\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580708 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580743 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbt59\" (UniqueName: \"kubernetes.io/projected/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-kube-api-access-nbt59\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580775 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580807 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-kubelet\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580840 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580873 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8t62\" (UniqueName: \"kubernetes.io/projected/f7b34559-da2f-4796-8f3f-c56b2725c464-kube-api-access-v8t62\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580907 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-socket-dir-parent\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580944 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-multus-certs\") pod \"multus-xwqvl\" 
(UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580977 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7bdm\" (UniqueName: \"kubernetes.io/projected/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-kube-api-access-x7bdm\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581010 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cnibin\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581042 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581078 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581114 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581148 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7b34559-da2f-4796-8f3f-c56b2725c464-rootfs\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581178 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-hostroot\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581210 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581242 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581274 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-netns\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581305 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581362 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581401 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581447 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581500 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-conf-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581536 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581569 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581610 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581647 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581681 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-os-release\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581807 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt6hw\" (UniqueName: \"kubernetes.io/projected/d5a53ef4-c701-457f-9cf2-85819bf04d1a-kube-api-access-bt6hw\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581977 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-system-cni-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582018 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-bin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582057 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582092 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582123 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert\") pod \"ovnkube-node-zlfmg\" (UID: 
\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582163 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582200 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-cni-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582234 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582273 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582312 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582449 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582509 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582551 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5dd\" (UniqueName: \"kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582589 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582622 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-daemon-config\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582655 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-system-cni-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582686 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582722 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b34559-da2f-4796-8f3f-c56b2725c464-proxy-tls\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582766 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582889 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-hosts-file\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582958 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582999 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc 
kubenswrapper[4926]: I0312 18:04:26.583035 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583070 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583106 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-multus\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583150 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583185 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7b34559-da2f-4796-8f3f-c56b2725c464-mcd-auth-proxy-config\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583225 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-os-release\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583306 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583331 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583355 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583377 4926 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583399 4926 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574480 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583526 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583548 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583696 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583705 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583756 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583804 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.584166 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.584449 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.584726 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.585054 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.585081 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.585104 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.586483 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.586586 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.586650 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.586985 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583422 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587073 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587105 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587131 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587157 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587181 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587208 4926 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587233 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587259 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574630 4926 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574686 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574720 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574786 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587535 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.587714 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:27.087669238 +0000 UTC m=+107.456295631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587765 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587956 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.587835 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.588172 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.588840 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.588866 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.588823 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.589536 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:27.089522237 +0000 UTC m=+107.458148570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.589683 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.574849 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575063 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575233 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.575189 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576390 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576717 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576711 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576734 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.590111 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.576905 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577197 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577228 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577315 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577256 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577387 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577076 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577683 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.577994 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578000 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578030 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578231 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578521 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578542 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578647 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578628 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579203 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.578920 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579767 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579835 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.579894 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.580761 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581017 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581052 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581359 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581372 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581415 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581435 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581641 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581713 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581735 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581870 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.581871 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582200 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582287 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582327 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582744 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.582880 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583062 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583074 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583290 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583333 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.583392 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.589992 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.590140 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.590432 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.590445 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.590512 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591022 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591095 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591135 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591221 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.591335 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:27.091254171 +0000 UTC m=+107.459880564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.590735 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591612 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591650 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591670 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.591932 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592227 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592269 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592552 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592560 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592877 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.592957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593064 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593566 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593572 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593630 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593723 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.593842 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594050 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594066 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594170 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594230 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594342 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594388 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594431 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.594621 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.595371 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.595737 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.595727 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.595940 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596122 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596126 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596337 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596503 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596517 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.596937 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.597334 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.598177 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.598508 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.599077 4926 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.599730 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.600262 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.600279 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.602496 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.603199 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.603541 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.603791 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.604022 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.604729 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.604875 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.605291 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.605287 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.605856 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.605936 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.605950 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.605980 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.605997 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.606138 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:27.106119479 +0000 UTC m=+107.474745842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.606398 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.607004 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.607389 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.607715 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.609969 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611257 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611344 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611427 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611641 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611712 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611708 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.611843 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.612131 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.612203 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.612394 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614157 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614636 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614698 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614710 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614726 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614736 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614856 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.614861 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.615613 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.616983 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.617012 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.617027 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:26 crc kubenswrapper[4926]: E0312 18:04:26.617085 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:27.117061894 +0000 UTC m=+107.485688307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.617499 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.618747 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.618771 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.618921 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.618971 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.619086 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.619108 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.619143 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.619666 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.621112 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.621574 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.621801 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.621933 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.623184 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.623537 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.623419 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32f
a41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"lo
g-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.624178 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.624235 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.624330 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.624455 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.627761 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.628397 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.628662 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.629611 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.629867 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.642027 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.645215 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":fals
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":
\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.654647 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.655214 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.669054 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.671966 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.673564 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.677764 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688339 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5dd\" (UniqueName: \"kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688378 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688400 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-daemon-config\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688422 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-system-cni-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688445 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688483 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b34559-da2f-4796-8f3f-c56b2725c464-proxy-tls\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688498 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-multus\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688513 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-hosts-file\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688528 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688554 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-os-release\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688566 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688580 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f7b34559-da2f-4796-8f3f-c56b2725c464-mcd-auth-proxy-config\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688594 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cni-binary-copy\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688612 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-etc-kubernetes\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688879 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688926 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688959 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688988 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689057 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689057 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cnibin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689075 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-multus\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " 
pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689102 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-k8s-cni-cncf-io\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689105 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cnibin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689153 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-k8s-cni-cncf-io\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689244 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-hosts-file\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689283 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-etc-kubernetes\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689335 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689354 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-os-release\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689384 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.688624 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-system-cni-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689417 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689489 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689634 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689757 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbt59\" (UniqueName: \"kubernetes.io/projected/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-kube-api-access-nbt59\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689804 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689888 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-kubelet\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689928 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.689969 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8t62\" (UniqueName: \"kubernetes.io/projected/f7b34559-da2f-4796-8f3f-c56b2725c464-kube-api-access-v8t62\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690065 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690107 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-socket-dir-parent\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690114 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-daemon-config\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690248 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-multus-certs\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690290 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7bdm\" (UniqueName: \"kubernetes.io/projected/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-kube-api-access-x7bdm\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690353 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cnibin\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690493 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690582 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690590 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690661 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690688 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690697 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-multus-certs\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690771 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7b34559-da2f-4796-8f3f-c56b2725c464-rootfs\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690834 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-hostroot\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.690937 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691041 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691132 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-netns\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691215 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691315 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-run-netns\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691401 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd\") pod \"ovnkube-node-zlfmg\" (UID: 
\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691476 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f7b34559-da2f-4796-8f3f-c56b2725c464-rootfs\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691546 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691616 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691707 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5a53ef4-c701-457f-9cf2-85819bf04d1a-cni-binary-copy\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691826 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-kubelet\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691843 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691877 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-hostroot\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691936 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cnibin\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691937 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 
18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691966 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-socket-dir-parent\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692026 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-conf-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.691986 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-conf-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692092 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692135 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-os-release\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692163 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692220 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692287 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692316 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692317 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692358 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-os-release\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692381 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692428 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7b34559-da2f-4796-8f3f-c56b2725c464-mcd-auth-proxy-config\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692490 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt6hw\" (UniqueName: \"kubernetes.io/projected/d5a53ef4-c701-457f-9cf2-85819bf04d1a-kube-api-access-bt6hw\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.692529 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-system-cni-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694180 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694250 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-bin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694315 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-host-var-lib-cni-bin\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694319 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-system-cni-dir\") pod \"multus-xwqvl\" (UID: 
\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694363 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694546 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694630 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694696 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.694766 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-cni-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695664 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695712 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695672 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5a53ef4-c701-457f-9cf2-85819bf04d1a-multus-cni-dir\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695741 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695963 4926 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695983 4926 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.695998 4926 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696017 4926 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696029 4926 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696041 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696062 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696075 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696088 4926 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696087 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696099 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696156 4926 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696174 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696195 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696238 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696253 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696267 4926 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696285 4926 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696299 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696311 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696324 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696341 4926 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696354 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696366 4926 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696378 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696394 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696406 4926 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696417 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696462 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696475 4926 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696488 4926 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696501 4926 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696517 4926 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696530 4926 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696585 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696634 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696656 4926 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696669 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696680 4926 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696695 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696717 4926 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696728 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696797 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696835 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696847 4926 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696859 4926 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696888 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696900 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696911 4926 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696922 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696936 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696947 4926 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696957 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696969 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696978 4926 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.696989 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697002 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697017 4926 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697029 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697039 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697049 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697061 4926 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697071 4926 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697081 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697093 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697107 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697116 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697127 4926 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697144 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697155 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697167 4926 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697180 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697199 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697227 4926 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697239 4926 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697250 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697264 
4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697274 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697284 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697298 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697310 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697325 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697337 4926 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697352 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697361 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697371 4926 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697380 4926 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697393 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697402 4926 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697412 4926 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697423 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697444 4926 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697470 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697494 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697511 4926 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697522 4926 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697533 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697546 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697561 4926 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697571 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697582 4926 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697601 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697614 4926 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697632 4926 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697642 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697656 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697669 4926 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697679 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697689 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697710 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697722 4926 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697733 4926 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697744 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697765 4926 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697774 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697784 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697793 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697807 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697817 4926 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697827 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697839 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697850 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697861 4926 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697875 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697887 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697899 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697912 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697922 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697936 4926 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node 
\"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697947 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697956 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697967 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697979 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697989 4926 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.697999 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698011 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698021 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698035 4926 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698044 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698055 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698064 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698074 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node 
\"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698088 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698103 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698104 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b34559-da2f-4796-8f3f-c56b2725c464-proxy-tls\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698112 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698187 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698207 4926 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698237 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698255 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698269 4926 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698282 4926 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698299 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698310 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698323 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698339 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698351 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698363 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698375 4926 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698401 4926 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698414 4926 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698425 4926 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698449 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698489 4926 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698503 4926 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698518 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698535 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698548 4926 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698561 4926 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698573 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698588 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698601 4926 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698613 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698625 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698666 4926 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698681 4926 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698695 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698711 4926 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698779 4926 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698799 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 12 18:04:26 crc kubenswrapper[4926]: 
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698832 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.698957 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.704687 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5dd\" (UniqueName: \"kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd\") pod \"ovnkube-node-zlfmg\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.706223 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbt59\" (UniqueName: \"kubernetes.io/projected/1d37aa11-8fa5-4eb3-8edd-6f71523623b5-kube-api-access-nbt59\") pod \"multus-additional-cni-plugins-srh42\" (UID: \"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\") " pod="openshift-multus/multus-additional-cni-plugins-srh42"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.708197 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt6hw\" (UniqueName: \"kubernetes.io/projected/d5a53ef4-c701-457f-9cf2-85819bf04d1a-kube-api-access-bt6hw\") pod \"multus-xwqvl\" (UID: \"d5a53ef4-c701-457f-9cf2-85819bf04d1a\") " pod="openshift-multus/multus-xwqvl"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.712362 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8t62\" (UniqueName: \"kubernetes.io/projected/f7b34559-da2f-4796-8f3f-c56b2725c464-kube-api-access-v8t62\") pod \"machine-config-daemon-hmdg8\" (UID: \"f7b34559-da2f-4796-8f3f-c56b2725c464\") " pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.715631 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7bdm\" (UniqueName: \"kubernetes.io/projected/594c806d-dd79-41ce-8e3a-a33d42bf0f7e-kube-api-access-x7bdm\") pod \"node-resolver-f9vxh\" (UID: \"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\") " pod="openshift-dns/node-resolver-f9vxh"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.716902 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.716927 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.716936 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.716950 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.716960 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.787372 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.795678 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.806349 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.813215 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-72f0695c2edf73654195d549a62c8fcccfaf8fc5295e2c261ac95fcbdc4e578b WatchSource:0}: Error finding container 72f0695c2edf73654195d549a62c8fcccfaf8fc5295e2c261ac95fcbdc4e578b: Status 404 returned error can't find the container with id 72f0695c2edf73654195d549a62c8fcccfaf8fc5295e2c261ac95fcbdc4e578b
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.816397 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.820482 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.820515 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.820526 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.820543 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.820557 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.824728 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xwqvl"
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.826097 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8d53cdc4e48c4d7f3859e37e80091956822e666ab97637824a53ad05c3b41920 WatchSource:0}: Error finding container 8d53cdc4e48c4d7f3859e37e80091956822e666ab97637824a53ad05c3b41920: Status 404 returned error can't find the container with id 8d53cdc4e48c4d7f3859e37e80091956822e666ab97637824a53ad05c3b41920
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.832055 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.839705 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f9vxh"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.846775 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-srh42"
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.847267 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b34559_da2f_4796_8f3f_c56b2725c464.slice/crio-f4b2e4a544758c1263364c0df7e07d77c8bc4370a470d837ea6efae07158abf9 WatchSource:0}: Error finding container f4b2e4a544758c1263364c0df7e07d77c8bc4370a470d837ea6efae07158abf9: Status 404 returned error can't find the container with id f4b2e4a544758c1263364c0df7e07d77c8bc4370a470d837ea6efae07158abf9
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.853638 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a53ef4_c701_457f_9cf2_85819bf04d1a.slice/crio-ad259f0c6d06a3faa766bd42496c02cc7bc98b9a5ebc84ff9eeb4ed17814b7e7 WatchSource:0}: Error finding container ad259f0c6d06a3faa766bd42496c02cc7bc98b9a5ebc84ff9eeb4ed17814b7e7: Status 404 returned error can't find the container with id ad259f0c6d06a3faa766bd42496c02cc7bc98b9a5ebc84ff9eeb4ed17814b7e7
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.873820 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod594c806d_dd79_41ce_8e3a_a33d42bf0f7e.slice/crio-12064d7968642d5fcd14005252eca5815ef36389939f49bd05f0edaa57a27ba8 WatchSource:0}: Error finding container 12064d7968642d5fcd14005252eca5815ef36389939f49bd05f0edaa57a27ba8: Status 404 returned error can't find the container with id 12064d7968642d5fcd14005252eca5815ef36389939f49bd05f0edaa57a27ba8
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.881896 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1f450f062448c5e8fdb1b02bd5b83f974900c9bf14e3078dff971588b7fe4341"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.883898 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f9vxh" event={"ID":"594c806d-dd79-41ce-8e3a-a33d42bf0f7e","Type":"ContainerStarted","Data":"12064d7968642d5fcd14005252eca5815ef36389939f49bd05f0edaa57a27ba8"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.885291 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"f4b2e4a544758c1263364c0df7e07d77c8bc4370a470d837ea6efae07158abf9"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.886478 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72f0695c2edf73654195d549a62c8fcccfaf8fc5295e2c261ac95fcbdc4e578b"}
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.890087 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerStarted","Data":"ad259f0c6d06a3faa766bd42496c02cc7bc98b9a5ebc84ff9eeb4ed17814b7e7"}
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.891179 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc33af41_5aa0_4254_ac75_69433d5f4ce9.slice/crio-30c6b754be03f1fd88325819e9237821f35a5d4ca5f0ad8545574e86e65cadf9 WatchSource:0}: Error finding container 30c6b754be03f1fd88325819e9237821f35a5d4ca5f0ad8545574e86e65cadf9: Status 404 returned error can't find the container with id 30c6b754be03f1fd88325819e9237821f35a5d4ca5f0ad8545574e86e65cadf9
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.891761 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8d53cdc4e48c4d7f3859e37e80091956822e666ab97637824a53ad05c3b41920"}
Mar 12 18:04:26 crc kubenswrapper[4926]: W0312 18:04:26.895337 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d37aa11_8fa5_4eb3_8edd_6f71523623b5.slice/crio-6834c4d42bb80ef9b2cbc21221a1f2bd8626677d5b221112fb9686702721bfff WatchSource:0}: Error finding container 6834c4d42bb80ef9b2cbc21221a1f2bd8626677d5b221112fb9686702721bfff: Status 404 returned error can't find the container with id 6834c4d42bb80ef9b2cbc21221a1f2bd8626677d5b221112fb9686702721bfff
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.924022 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.924070 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.924083 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.924103 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:26 crc kubenswrapper[4926]: I0312 18:04:26.924116 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:26Z","lastTransitionTime":"2026-03-12T18:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.026504 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.026548 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.026557 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.026570 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.026579 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.102027 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.102154 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.102186 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.102272 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.102339 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:28.102313075 +0000 UTC m=+108.470939418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.102476 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.102553 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:28.102534653 +0000 UTC m=+108.471160986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.102618 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:28.102573804 +0000 UTC m=+108.471200137 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.128904 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.128939 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.128950 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.128966 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.128978 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.202900 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.202970 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203071 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203085 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203095 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203106 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203133 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203140 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:28.20312775 +0000 UTC m=+108.571754083 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203147 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 18:04:27 crc kubenswrapper[4926]: E0312 18:04:27.203195 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:28.203179082 +0000 UTC m=+108.571805415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.231715 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.231761 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.231772 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.231791 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.231803 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.334860 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.334902 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.334917 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.334933 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.334944 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.437670 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.437722 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.437733 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.437750 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.437763 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.540861 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.540906 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.540918 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.540935 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.540949 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.643263 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.643298 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.643306 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.643319 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.643327 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.745663 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.745726 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.745747 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.745775 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.745796 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.848770 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.848834 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.848854 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.848880 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.848898 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.900702 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.900750 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.904042 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.904074 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.907259 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.910578 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerStarted","Data":"54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.912514 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818" exitCode=0
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.912580 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.912601 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerStarted","Data":"6834c4d42bb80ef9b2cbc21221a1f2bd8626677d5b221112fb9686702721bfff"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.915983 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" exitCode=0
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.916037 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.916053 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"30c6b754be03f1fd88325819e9237821f35a5d4ca5f0ad8545574e86e65cadf9"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.918533 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f9vxh" event={"ID":"594c806d-dd79-41ce-8e3a-a33d42bf0f7e","Type":"ContainerStarted","Data":"690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e"}
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.929995 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:27Z is after 2025-08-24T17:21:41Z"
Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.952654 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.953672 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.953707 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.953721 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.953740 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.953754 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:27Z","lastTransitionTime":"2026-03-12T18:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.968971 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:27 crc kubenswrapper[4926]: I0312 18:04:27.989582 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.004608 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.021831 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.043155 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.055734 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.055777 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.055788 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.055805 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.055818 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.066271 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.080090 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.096825 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.108880 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.111696 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.111886 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.111937 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.111993 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:30.111968621 +0000 UTC m=+110.480595104 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.111999 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.112044 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.112067 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:30.112057304 +0000 UTC m=+110.480683857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.112111 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:30.112091645 +0000 UTC m=+110.480717978 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.120236 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.133408 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.146015 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.158241 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.158276 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.158287 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.158303 4926 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.158314 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.160081 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.171280 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.181937 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.192290 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.211040 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.212389 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.212462 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212568 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212587 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212599 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212635 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:30.212622241 +0000 UTC m=+110.581248574 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212568 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212836 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212846 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.212872 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:30.212863488 +0000 UTC m=+110.581489821 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.229673 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc 
kubenswrapper[4926]: I0312 18:04:28.242115 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.253676 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.260248 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.260286 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.260296 4926 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.260313 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.260322 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.362644 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.362690 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.362706 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.362728 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.362744 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.469288 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.469687 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.469699 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.469717 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.469731 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.489643 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.489757 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.489811 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.489882 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.495419 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.495699 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.499252 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.500307 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.501672 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.502424 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.503500 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.504134 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.504862 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.506424 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.513772 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.514606 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.516976 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.518040 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.519208 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.520052 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.520685 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.521849 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.522750 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.523822 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.524718 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.525726 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.526831 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.527547 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.529347 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.530193 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.530831 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.532051 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.533309 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.533941 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.534737 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.536059 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.536581 4926 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.536696 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538747 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538781 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538789 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538803 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538813 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.538977 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.539635 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.540379 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.542019 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.542798 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.543850 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.544648 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.545945 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.546595 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.548297 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.549112 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.550364 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.550970 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.552062 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.552746 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.554071 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.554724 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.555896 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.556585 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.557256 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.557738 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.559140 4926 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.559991 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.563823 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.563882 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.563902 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.563925 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.563941 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.582331 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.586734 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.586767 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.586779 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.586796 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.586808 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.598320 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.602663 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.602721 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.602737 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.602761 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.602777 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.616067 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.620589 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.620627 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.620639 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.620655 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.620664 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.635679 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: E0312 18:04:28.635823 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.637282 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.637341 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.637351 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.637366 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.637375 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.739843 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.740314 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.740325 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.740346 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.740358 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.841993 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.842022 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.842030 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.842042 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.842050 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.933345 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101" exitCode=0 Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.933428 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939710 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939742 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939754 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939763 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939773 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.939781 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.944591 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.944663 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.944675 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.944713 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.944726 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:28Z","lastTransitionTime":"2026-03-12T18:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.946189 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.960142 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.976246 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:28 crc kubenswrapper[4926]: I0312 18:04:28.994533 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.006976 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.021779 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.035519 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.049094 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.050550 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.050600 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.050615 4926 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.050634 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.050645 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.069882 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z 
is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.087326 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\
"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.100706 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.153288 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.153335 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.153349 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.153368 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.153380 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.256484 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.256550 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.256568 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.256592 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.256610 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.359118 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.359177 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.359193 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.359218 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.359236 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.462095 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.462159 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.462183 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.462212 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.462235 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.502810 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.502906 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.565590 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.565637 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.565649 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.565664 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.565676 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.668144 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.668190 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.668200 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.668213 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.668222 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.770734 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.770793 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.770815 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.770846 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.770869 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.873835 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.873902 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.873925 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.873957 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.873985 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.946375 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.950832 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.951021 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.956777 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e" exitCode=0 Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.956836 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.970635 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.982933 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.983020 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.983044 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.983074 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.983099 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:29Z","lastTransitionTime":"2026-03-12T18:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.984379 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4gmrt"] Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.984794 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.987224 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:29Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.987545 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.987787 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.987946 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 18:04:29 crc kubenswrapper[4926]: I0312 18:04:29.998391 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.005181 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.028239 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.048639 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.072086 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.095670 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.095705 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.095715 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.095730 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.095740 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.104312 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.122822 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131494 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131637 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-serviceca\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.131668 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:34.131645455 +0000 UTC m=+114.500271788 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131708 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131737 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-host\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131765 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.131792 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvtd\" (UniqueName: \"kubernetes.io/projected/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-kube-api-access-fmvtd\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.131830 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.131882 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.131915 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:34.131899572 +0000 UTC m=+114.500525905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.131929 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:34.131922933 +0000 UTC m=+114.500549266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.142584 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.155617 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.171288 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.184200 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.194293 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.198004 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.198032 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.198041 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.198056 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.198066 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.207212 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.220497 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.230094 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.230947 4926 reflector.go:368] Caches populated 
for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232441 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232498 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-serviceca\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232536 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232542 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232560 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232562 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-host\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232570 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232594 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvtd\" (UniqueName: \"kubernetes.io/projected/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-kube-api-access-fmvtd\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232607 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:34.232595914 +0000 UTC m=+114.601222247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.232683 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-host\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232728 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232757 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232770 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.232828 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:34.23280676 +0000 UTC m=+114.601433083 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.233533 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-serviceca\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.242561 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.261292 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.262879 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvtd\" (UniqueName: \"kubernetes.io/projected/dcfdbe34-faf7-4306-a2d8-6e95715f4f2a-kube-api-access-fmvtd\") pod \"node-ca-4gmrt\" (UID: \"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\") " pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.280799 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z 
is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.300316 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.300347 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.300356 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.300370 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.300380 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.304627 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.314988 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4gmrt" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.318116 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.329975 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: W0312 18:04:30.334992 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcfdbe34_faf7_4306_a2d8_6e95715f4f2a.slice/crio-a534e02499142e9db4e8c9bd3e0a002d9b687e211c85fa0d45f09ee3e7adb767 WatchSource:0}: Error finding container a534e02499142e9db4e8c9bd3e0a002d9b687e211c85fa0d45f09ee3e7adb767: Status 404 returned error can't find the container with id a534e02499142e9db4e8c9bd3e0a002d9b687e211c85fa0d45f09ee3e7adb767 Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.342396 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.354051 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.366050 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.402038 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.402113 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.402127 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.402143 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.402155 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.489665 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.489810 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.490044 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.490143 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.490227 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:30 crc kubenswrapper[4926]: E0312 18:04:30.490369 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.504004 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.504051 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.504068 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.504089 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.504104 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.506059 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.519961 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.534291 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.548360 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.567125 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z 
is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.587074 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.605941 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.605991 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.606001 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.606018 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.606030 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.608824 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.628616 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.645021 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.656925 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.668525 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.677403 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.684934 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.707926 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: 
I0312 18:04:30.707970 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.707981 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.707995 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.708006 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.810495 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.810545 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.810562 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.810585 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.810601 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.913432 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.913493 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.913504 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.913521 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.913532 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:30Z","lastTransitionTime":"2026-03-12T18:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.963726 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2" exitCode=0 Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.963801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.965801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gmrt" event={"ID":"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a","Type":"ContainerStarted","Data":"aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.965854 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4gmrt" event={"ID":"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a","Type":"ContainerStarted","Data":"a534e02499142e9db4e8c9bd3e0a002d9b687e211c85fa0d45f09ee3e7adb767"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.967944 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359"} Mar 12 18:04:30 crc kubenswrapper[4926]: I0312 18:04:30.983510 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.004063 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.016158 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.016236 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.016258 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.016281 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.016298 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.018571 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.032394 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.045127 4926 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.055753 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.075407 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.088614 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.100199 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.115036 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.119561 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.119614 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.119624 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.119640 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.119650 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.128332 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.144777 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.157889 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.170763 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.182134 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.203550 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z 
is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.215737 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.221596 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.221625 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.221636 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.221650 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.221662 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.231000 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.242424 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.253717 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.266123 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.276208 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.287252 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.298244 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.313419 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.323987 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.324016 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.324024 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.324037 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.324045 4926 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.328311 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.16
8.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.426338 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.426394 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.426412 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.426445 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.426510 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.528913 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.528969 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.528986 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.529009 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.529026 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.631367 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.631477 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.631495 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.631518 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.631535 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.734325 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.734365 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.734376 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.734390 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.734400 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.837212 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.837629 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.837650 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.837676 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.837693 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.940502 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.940571 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.940589 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.940613 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.940629 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:31Z","lastTransitionTime":"2026-03-12T18:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.973210 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a" exitCode=0 Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.973316 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.978547 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} Mar 12 18:04:31 crc kubenswrapper[4926]: I0312 18:04:31.991585 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:31Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.009587 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.032926 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.043019 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.043056 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.043066 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.043082 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.043092 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.051847 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.062386 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.076792 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.089275 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.101344 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.111432 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.122191 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.133833 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.145128 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.146053 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.146089 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.146099 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.146113 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.146122 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.157811 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:32Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.248122 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.248163 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.248173 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.248188 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.248199 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.351055 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.351511 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.351751 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.351947 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.352126 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.455605 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.455663 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.455682 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.455710 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.455726 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.489638 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.489679 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.489713 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:32 crc kubenswrapper[4926]: E0312 18:04:32.489789 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:32 crc kubenswrapper[4926]: E0312 18:04:32.489927 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:32 crc kubenswrapper[4926]: E0312 18:04:32.490041 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.558144 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.558188 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.558198 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.558214 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.558224 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.660914 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.660957 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.660969 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.660986 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.660997 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.763762 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.763800 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.763811 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.763827 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.763839 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.869876 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.869928 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.869938 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.869963 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.869973 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.972598 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.972683 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.972708 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.972738 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.972763 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:32Z","lastTransitionTime":"2026-03-12T18:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.987394 4926 generic.go:334] "Generic (PLEG): container finished" podID="1d37aa11-8fa5-4eb3-8edd-6f71523623b5" containerID="a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef" exitCode=0 Mar 12 18:04:32 crc kubenswrapper[4926]: I0312 18:04:32.987483 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerDied","Data":"a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.009463 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.030648 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.051651 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z 
is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.067782 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffc
db7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\
\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.074870 4926 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.074921 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.074934 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.074951 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.074964 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.080229 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.093364 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.102035 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.112995 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.121489 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.136618 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.149992 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.165521 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177184 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:33Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177252 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177287 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177298 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177311 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.177321 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.280003 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.280051 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.280064 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.280083 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.280095 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.383467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.383513 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.383527 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.383547 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.383559 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.486543 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.486592 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.486607 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.486629 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.486644 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.590910 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.590962 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.590980 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.591001 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.591021 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.694170 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.695008 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.695026 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.695085 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.695103 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.797729 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.797803 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.797821 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.797848 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.797868 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.900272 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.900328 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.900339 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.900361 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.900377 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:33Z","lastTransitionTime":"2026-03-12T18:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:33 crc kubenswrapper[4926]: I0312 18:04:33.995763 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" event={"ID":"1d37aa11-8fa5-4eb3-8edd-6f71523623b5","Type":"ContainerStarted","Data":"eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.000573 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.000869 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.000912 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.000923 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.002027 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.002063 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.002072 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.002088 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.002096 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.011657 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.029373 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.042545 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"
},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff
3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.094998 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.095073 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.095288 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.107824 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.107866 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.107875 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.107889 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.107898 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.109174 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.122288 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.132955 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.143567 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.151391 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.161348 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.170412 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.173731 4926 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.173855 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:42.173831671 +0000 UTC m=+122.542458024 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.174185 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.174324 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:42.174310116 +0000 UTC m=+122.542936509 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.174324 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.174629 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.174736 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.174788 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-12 18:04:42.17477204 +0000 UTC m=+122.543398443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.181330 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.193028 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.210555 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.210593 4926 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.210602 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.210616 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.210627 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.212138 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.232997 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.244092 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.254705 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.272710 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2
265ae735dcb8e3486dfc2115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.275166 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.275206 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275306 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275326 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275337 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275367 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275389 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275402 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275377 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:42.275365338 +0000 UTC m=+122.643991671 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.275481 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:42.275467241 +0000 UTC m=+122.644093574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.286258 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":
\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.297781 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.311688 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.312681 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.312722 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.312733 4926 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.312750 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.312760 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.324994 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.338198 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.350537 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.360656 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.370086 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:34Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.415200 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.415242 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.415251 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.415265 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.415275 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.489020 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.489133 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.489037 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.489257 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.489398 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:34 crc kubenswrapper[4926]: E0312 18:04:34.489465 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.517703 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.517766 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.517785 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.517809 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.517826 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.620282 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.620338 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.620363 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.620393 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.620409 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.724003 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.724098 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.724115 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.724138 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.724155 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.827891 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.827999 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.828016 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.828045 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.828064 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.931356 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.931500 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.931522 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.931550 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:34 crc kubenswrapper[4926]: I0312 18:04:34.931570 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:34Z","lastTransitionTime":"2026-03-12T18:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.033186 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.033240 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.033251 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.033268 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.033284 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.136262 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.136353 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.136369 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.136392 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.136407 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.239243 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.239286 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.239321 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.239338 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.239349 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.341971 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.342041 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.342063 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.342091 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.342112 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.445339 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.445410 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.445433 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.445498 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.445522 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.549127 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.549194 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.549220 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.549250 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.549274 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.652207 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.652259 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.652278 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.652302 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.652319 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.755820 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.755878 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.755899 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.755925 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.755945 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.858850 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.858883 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.858894 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.858910 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.858921 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.961221 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.961252 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.961262 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.961279 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.961292 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:35Z","lastTransitionTime":"2026-03-12T18:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.993103 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc"] Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.993639 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.997013 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 18:04:35 crc kubenswrapper[4926]: I0312 18:04:35.997062 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.013932 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.032390 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.047970 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.063503 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.063723 4926 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.063801 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.063960 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.064066 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.064145 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.082746 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2
265ae735dcb8e3486dfc2115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.095507 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.095575 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.095599 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.095695 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwn2v\" (UniqueName: \"kubernetes.io/projected/12de8a94-72e6-4d72-8e39-42f3ef9d1125-kube-api-access-vwn2v\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" 
(UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.095879 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 
18:04:36.107892 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.118621 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.129654 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.140675 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.151407 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.166484 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.166536 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 
crc kubenswrapper[4926]: I0312 18:04:36.166549 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.166565 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.166577 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.169075 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\
\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.179704 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for 
pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.193809 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.196209 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 
12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.196270 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.196290 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.196326 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwn2v\" (UniqueName: \"kubernetes.io/projected/12de8a94-72e6-4d72-8e39-42f3ef9d1125-kube-api-access-vwn2v\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.197068 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.197194 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12de8a94-72e6-4d72-8e39-42f3ef9d1125-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.202269 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12de8a94-72e6-4d72-8e39-42f3ef9d1125-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.217269 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwn2v\" (UniqueName: \"kubernetes.io/projected/12de8a94-72e6-4d72-8e39-42f3ef9d1125-kube-api-access-vwn2v\") pod \"ovnkube-control-plane-749d76644c-fq9dc\" (UID: \"12de8a94-72e6-4d72-8e39-42f3ef9d1125\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.275257 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.275298 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.275313 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 
18:04:36.275336 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.275353 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.309375 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" Mar 12 18:04:36 crc kubenswrapper[4926]: W0312 18:04:36.325021 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12de8a94_72e6_4d72_8e39_42f3ef9d1125.slice/crio-f84fc990f02e81fbca9e4f4d9a207c3bbefbfb7135fc031b4ce43b6a9f2d9125 WatchSource:0}: Error finding container f84fc990f02e81fbca9e4f4d9a207c3bbefbfb7135fc031b4ce43b6a9f2d9125: Status 404 returned error can't find the container with id f84fc990f02e81fbca9e4f4d9a207c3bbefbfb7135fc031b4ce43b6a9f2d9125 Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.383845 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.383907 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.383919 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.383939 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.383951 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.486596 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.486651 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.486712 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.486735 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.486748 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.488871 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.488893 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.488867 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.488999 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.489077 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.489208 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.589475 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.589869 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.589881 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.589903 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.589914 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.693024 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.693088 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.693109 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.693134 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.693152 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.714656 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-n7pd7"] Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.715577 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.715707 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.733808 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.753962 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.783155 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.795363 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.795612 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.795697 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.795818 4926 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.795910 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.800242 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\
\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.801683 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6bnm\" (UniqueName: \"kubernetes.io/projected/211eeae6-9b41-484b-bd13-99c1c28cdf96-kube-api-access-n6bnm\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.801728 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.809310 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.821953 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.836737 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.847070 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.860099 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.880582 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.898627 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.898675 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.898713 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.898731 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.898743 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:36Z","lastTransitionTime":"2026-03-12T18:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.900107 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.902692 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6bnm\" (UniqueName: \"kubernetes.io/projected/211eeae6-9b41-484b-bd13-99c1c28cdf96-kube-api-access-n6bnm\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.902789 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " 
pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.902949 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:36 crc kubenswrapper[4926]: E0312 18:04:36.903029 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:37.403005018 +0000 UTC m=+117.771631391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.917689 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.932630 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.940697 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6bnm\" (UniqueName: \"kubernetes.io/projected/211eeae6-9b41-484b-bd13-99c1c28cdf96-kube-api-access-n6bnm\") pod 
\"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.944713 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:36 crc kubenswrapper[4926]: I0312 18:04:36.954600 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:36Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.001371 4926 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.001721 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.001831 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.001948 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.002211 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.010843 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" event={"ID":"12de8a94-72e6-4d72-8e39-42f3ef9d1125","Type":"ContainerStarted","Data":"f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.010881 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" event={"ID":"12de8a94-72e6-4d72-8e39-42f3ef9d1125","Type":"ContainerStarted","Data":"9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.010892 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" event={"ID":"12de8a94-72e6-4d72-8e39-42f3ef9d1125","Type":"ContainerStarted","Data":"f84fc990f02e81fbca9e4f4d9a207c3bbefbfb7135fc031b4ce43b6a9f2d9125"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.013598 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/0.log" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.017992 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115" exitCode=1 Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.018053 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.019123 4926 scope.go:117] "RemoveContainer" containerID="be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.030537 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.043785 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.067578 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.083369 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.097281 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.105746 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.105788 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.105800 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.105816 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.105827 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.123468 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.138967 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.150476 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.165655 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.177369 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.187059 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.203849 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.207529 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.207551 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.207560 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.207575 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.207586 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.219765 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.232088 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.244036 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.256696 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.269298 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.282593 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.295059 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.305144 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.309837 4926 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.309874 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.309888 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.309904 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.309915 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.317655 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.329879 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.347049 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2
265ae735dcb8e3486dfc2115\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"I0312 18:04:36.449007 6832 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 18:04:36.449037 6832 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 18:04:36.449512 6832 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 18:04:36.449524 6832 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 18:04:36.449541 6832 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 18:04:36.449629 6832 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:36.449657 6832 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:36.449688 6832 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:36.449696 6832 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:36.449714 6832 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 18:04:36.449719 6832 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:36.449738 6832 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:36.449746 6832 factory.go:656] Stopping watch factory\\\\nI0312 18:04:36.449751 6832 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:36.449761 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:36.449740 6832 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.361122 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.374909 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.386990 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.397112 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.404882 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.411672 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:37 crc kubenswrapper[4926]: E0312 18:04:37.411793 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:37 crc 
kubenswrapper[4926]: E0312 18:04:37.411843 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:38.411828051 +0000 UTC m=+118.780454384 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.412789 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.412833 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.412842 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.412861 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.412873 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.416944 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.432890 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:37Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.514847 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.514902 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.514914 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.514932 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.514943 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.627646 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.627688 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.627699 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.627716 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.627728 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.730918 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.730989 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.731006 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.731034 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.731051 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.833565 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.833604 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.833621 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.833636 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.833646 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.936282 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.936317 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.936325 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.936341 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:37 crc kubenswrapper[4926]: I0312 18:04:37.936350 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:37Z","lastTransitionTime":"2026-03-12T18:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.024171 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/0.log" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.028861 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.029474 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.039466 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.039507 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.039521 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.039541 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.039557 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.043584 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.062553 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.080560 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.099824 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"I0312 18:04:36.449007 6832 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 18:04:36.449037 6832 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 18:04:36.449512 6832 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 18:04:36.449524 6832 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 18:04:36.449541 6832 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 18:04:36.449629 6832 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:36.449657 6832 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:36.449688 6832 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:36.449696 6832 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:36.449714 6832 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 18:04:36.449719 6832 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:36.449738 6832 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:36.449746 6832 factory.go:656] Stopping watch factory\\\\nI0312 18:04:36.449751 6832 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:36.449761 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:36.449740 6832 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.121814 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.135940 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.141177 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.141228 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.141239 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.141255 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.141266 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.153196 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.166685 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.178850 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.193125 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.203068 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.215361 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.225675 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.238647 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.244025 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.244058 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.244071 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.244086 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.244099 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.249774 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.346339 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.346392 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.346404 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.346421 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.346432 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.432583 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.432753 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.432866 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:40.432838124 +0000 UTC m=+120.801464497 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.450257 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.450326 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.450345 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.450372 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.450391 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.489800 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.490010 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.490046 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.490071 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.490142 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.490283 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.490415 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.490533 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.502878 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.553580 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.553656 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.553667 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.553681 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.553691 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.652212 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.652304 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.652325 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.652349 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.652364 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.671267 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.681257 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.681394 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.681432 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.681523 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.681544 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.702652 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.707396 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.707510 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.707535 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.707566 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.707588 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.727639 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...identical image list omitted; byte-for-byte the same as the previous attempt... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.731766 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.731856 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.731883 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.731898 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.731911 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.744686 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...identical image list omitted; byte-for-byte the same as the previous attempt... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.748814 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.748840 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.748851 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.748866 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.748899 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.762564 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...identical image list omitted; byte-for-byte the same as the previous attempt... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:38Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:38 crc kubenswrapper[4926]: E0312 18:04:38.762714 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.763962 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.763995 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.764004 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.764020 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.764030 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.866663 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.866709 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.866720 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.866738 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.866749 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.970214 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.970247 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.970256 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.970272 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:38 crc kubenswrapper[4926]: I0312 18:04:38.970284 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:38Z","lastTransitionTime":"2026-03-12T18:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.034891 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/1.log" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.035788 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/0.log" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.038938 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572" exitCode=1 Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.039036 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.039109 4926 scope.go:117] "RemoveContainer" containerID="be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.041065 4926 scope.go:117] "RemoveContainer" containerID="3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572" Mar 12 18:04:39 crc kubenswrapper[4926]: E0312 18:04:39.041289 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.064815 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.072565 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.072606 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.072618 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.072635 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.072649 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.089765 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.105439 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.118328 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.135717 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.153122 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.169118 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.174371 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.174400 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.174411 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.174431 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.174462 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.188863 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6eea7ebb0023aaf587a20ffee0cf7dd08281f2265ae735dcb8e3486dfc2115\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"I0312 18:04:36.449007 6832 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0312 18:04:36.449037 6832 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0312 18:04:36.449512 6832 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0312 18:04:36.449524 6832 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0312 18:04:36.449541 6832 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0312 18:04:36.449629 6832 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:36.449657 6832 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:36.449688 6832 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:36.449696 6832 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:36.449714 6832 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0312 18:04:36.449719 6832 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:36.449738 6832 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:36.449746 6832 factory.go:656] Stopping watch factory\\\\nI0312 18:04:36.449751 6832 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:36.449761 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:36.449740 6832 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.203157 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.212664 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.227188 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.238817 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.254119 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.264226 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277000 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277153 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277185 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277199 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277217 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.277229 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.287782 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:39Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.379912 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.379973 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.379990 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.380014 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.380031 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.482751 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.482798 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.482813 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.482828 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.482839 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.585146 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.585201 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.585214 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.585231 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.585242 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.689016 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.689115 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.689135 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.689588 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.689811 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.792936 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.793013 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.793035 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.793062 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.793079 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.895206 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.895253 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.895263 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.895275 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.895285 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.998577 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.998634 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.998654 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.998677 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:39 crc kubenswrapper[4926]: I0312 18:04:39.998693 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:39Z","lastTransitionTime":"2026-03-12T18:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.045435 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/1.log" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.050702 4926 scope.go:117] "RemoveContainer" containerID="3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.050866 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.073765 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.095028 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.103350 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.103379 4926 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.103388 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.103401 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.103410 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:40Z","lastTransitionTime":"2026-03-12T18:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.115215 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.132970 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.155134 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.170509 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.185991 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.202246 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.206471 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.206518 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.206535 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.206558 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.206579 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:40Z","lastTransitionTime":"2026-03-12T18:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.222656 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.291872 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.306364 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.309484 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.309521 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.309533 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.309549 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.309560 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:40Z","lastTransitionTime":"2026-03-12T18:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.323146 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" 
Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.336804 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.350167 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.361719 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.373944 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.412212 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.412256 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.412265 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.412278 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.412305 4926 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:40Z","lastTransitionTime":"2026-03-12T18:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.476068 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.476219 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.476284 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:44.476266905 +0000 UTC m=+124.844893258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.492507 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.492657 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.493042 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.493125 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.493188 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.493250 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.493308 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.493380 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.512482 4926 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.512584 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.529097 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.544403 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.556736 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.573288 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: E0312 18:04:40.577752 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.589832 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.606160 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.625148 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f37
35f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.642691 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.655058 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.669906 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 
18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.683737 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.695858 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.705378 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.715879 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:40 crc kubenswrapper[4926]: I0312 18:04:40.726560 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.234894 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.250358 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.264339 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.279534 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.297485 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.312923 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.336089 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f37
35f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.374253 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.386112 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.400391 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.414290 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.444510 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.465265 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.485878 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.501027 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.514291 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:41 crc kubenswrapper[4926]: I0312 18:04:41.528762 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.188650 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.188877 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:04:58.188839358 +0000 UTC m=+138.557465731 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.189121 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.189232 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.189377 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.189543 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:58.18952182 +0000 UTC m=+138.558148233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.189396 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.189740 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:58.189711565 +0000 UTC m=+138.558337978 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.290751 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.290894 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.290986 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291023 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291036 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291087 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291106 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:58.291084878 +0000 UTC m=+138.659711211 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291116 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291136 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.291225 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:58.291185481 +0000 UTC m=+138.659811904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.489608 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.490060 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.489695 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.489731 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:42 crc kubenswrapper[4926]: I0312 18:04:42.489671 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.490587 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.490328 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:42 crc kubenswrapper[4926]: E0312 18:04:42.490714 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:44 crc kubenswrapper[4926]: I0312 18:04:44.489506 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:44 crc kubenswrapper[4926]: I0312 18:04:44.489533 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.489671 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.489798 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:44 crc kubenswrapper[4926]: I0312 18:04:44.489596 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:44 crc kubenswrapper[4926]: I0312 18:04:44.489905 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.489953 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.490062 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:44 crc kubenswrapper[4926]: I0312 18:04:44.514997 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.515201 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:44 crc kubenswrapper[4926]: E0312 18:04:44.515280 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:04:52.515256662 +0000 UTC m=+132.883883035 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:45 crc kubenswrapper[4926]: E0312 18:04:45.579138 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:04:46 crc kubenswrapper[4926]: I0312 18:04:46.489620 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:46 crc kubenswrapper[4926]: I0312 18:04:46.489689 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:46 crc kubenswrapper[4926]: E0312 18:04:46.490186 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:46 crc kubenswrapper[4926]: I0312 18:04:46.489784 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:46 crc kubenswrapper[4926]: I0312 18:04:46.489746 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:46 crc kubenswrapper[4926]: E0312 18:04:46.490621 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:46 crc kubenswrapper[4926]: E0312 18:04:46.490771 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:46 crc kubenswrapper[4926]: E0312 18:04:46.490832 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:48 crc kubenswrapper[4926]: I0312 18:04:48.489874 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:48 crc kubenswrapper[4926]: I0312 18:04:48.489893 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:48 crc kubenswrapper[4926]: I0312 18:04:48.490010 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:48 crc kubenswrapper[4926]: I0312 18:04:48.489967 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:48 crc kubenswrapper[4926]: E0312 18:04:48.490144 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:48 crc kubenswrapper[4926]: E0312 18:04:48.490304 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:48 crc kubenswrapper[4926]: E0312 18:04:48.490420 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:48 crc kubenswrapper[4926]: E0312 18:04:48.490599 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.134292 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.134353 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.134371 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.134394 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.134411 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:49Z","lastTransitionTime":"2026-03-12T18:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.150244 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:49Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.157753 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.157786 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.157795 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.157808 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.157816 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:49Z","lastTransitionTime":"2026-03-12T18:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.178023 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:49Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.182848 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.182887 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.182900 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.182916 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.182926 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:49Z","lastTransitionTime":"2026-03-12T18:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.198483 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:49Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.203153 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.203205 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.203215 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.203235 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.203250 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:49Z","lastTransitionTime":"2026-03-12T18:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.219599 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:49Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.223369 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.223426 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.223467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.223492 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:49 crc kubenswrapper[4926]: I0312 18:04:49.223509 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:49Z","lastTransitionTime":"2026-03-12T18:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.237005 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:49Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:49 crc kubenswrapper[4926]: E0312 18:04:49.237200 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.489189 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.489253 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.489227 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.489204 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:50 crc kubenswrapper[4926]: E0312 18:04:50.489350 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:50 crc kubenswrapper[4926]: E0312 18:04:50.489521 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:50 crc kubenswrapper[4926]: E0312 18:04:50.489758 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:50 crc kubenswrapper[4926]: E0312 18:04:50.489824 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.500594 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.515533 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.529985 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.544020 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.558521 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.569343 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: E0312 18:04:50.580070 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.582260 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.599957 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.621746 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.637488 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.649228 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.666344 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.682252 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.701410 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.718479 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:50 crc kubenswrapper[4926]: I0312 18:04:50.738899 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:50Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:52 crc kubenswrapper[4926]: I0312 18:04:52.489624 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:52 crc kubenswrapper[4926]: I0312 18:04:52.489683 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.489795 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:52 crc kubenswrapper[4926]: I0312 18:04:52.489908 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:52 crc kubenswrapper[4926]: I0312 18:04:52.489971 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.490271 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.490401 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.490579 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:52 crc kubenswrapper[4926]: I0312 18:04:52.600660 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.600840 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:52 crc kubenswrapper[4926]: E0312 18:04:52.600906 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:05:08.600884861 +0000 UTC m=+148.969511224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:04:54 crc kubenswrapper[4926]: I0312 18:04:54.490006 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:54 crc kubenswrapper[4926]: I0312 18:04:54.490056 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:54 crc kubenswrapper[4926]: I0312 18:04:54.490104 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:54 crc kubenswrapper[4926]: I0312 18:04:54.490056 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:54 crc kubenswrapper[4926]: E0312 18:04:54.490245 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:54 crc kubenswrapper[4926]: E0312 18:04:54.490397 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:54 crc kubenswrapper[4926]: E0312 18:04:54.490576 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:54 crc kubenswrapper[4926]: E0312 18:04:54.490732 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:55 crc kubenswrapper[4926]: I0312 18:04:55.490849 4926 scope.go:117] "RemoveContainer" containerID="3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572" Mar 12 18:04:55 crc kubenswrapper[4926]: E0312 18:04:55.581783 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.108169 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/1.log" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.111111 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd"} Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.111479 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.122717 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.140131 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.156324 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d
56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.168615 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.185553 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.197793 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.215377 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.231110 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.245252 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.269573 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.289116 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.311672 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.359896 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.373558 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.385808 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.397410 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:56Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.489802 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.489857 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.489820 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:56 crc kubenswrapper[4926]: E0312 18:04:56.489919 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:56 crc kubenswrapper[4926]: E0312 18:04:56.490097 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:56 crc kubenswrapper[4926]: I0312 18:04:56.490186 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:56 crc kubenswrapper[4926]: E0312 18:04:56.490269 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:56 crc kubenswrapper[4926]: E0312 18:04:56.490353 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.117506 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/2.log" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.119007 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/1.log" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.123716 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" exitCode=1 Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.123817 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd"} Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.123984 4926 scope.go:117] "RemoveContainer" containerID="3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.125071 4926 scope.go:117] "RemoveContainer" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" Mar 12 18:04:57 crc kubenswrapper[4926]: E0312 18:04:57.125387 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.145823 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.163393 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.181487 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.197293 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.211702 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.227732 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.241364 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.272425 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc4
38846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e57e6a3873367656206dd5f7b6b7302dd876f3735f7e7fc4e741cd61f966572\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:38Z\\\",\\\"message\\\":\\\"ft/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.056752 7017 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0312 18:04:38.057188 7017 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:38.057288 7017 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:38.057368 7017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0312 18:04:38.057393 7017 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:38.057474 7017 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:38.057564 7017 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0312 18:04:38.057640 7017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0312 18:04:38.057709 7017 factory.go:656] Stopping watch factory\\\\nI0312 18:04:38.057731 7017 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:38.057773 7017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0312 18:04:38.057798 7017 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:38.057895 7017 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 
ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.296536 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.312034 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.327249 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.340763 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.354736 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.365682 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.383122 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:57 crc kubenswrapper[4926]: I0312 18:04:57.397606 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:57Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.129815 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/2.log" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.134019 4926 scope.go:117] "RemoveContainer" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.134266 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" 
podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.154282 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.170716 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.189414 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.205056 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.217249 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.231832 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.250242 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.263759 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.271522 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.271938 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:05:30.271901526 +0000 UTC m=+170.640527869 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.272086 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.273731 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.273851 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.273730 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.273930 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2026-03-12 18:05:30.273884505 +0000 UTC m=+170.642510878 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.274129 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:05:30.274084411 +0000 UTC m=+170.642710774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.301730 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc4
38846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.323299 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.343002 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.359399 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.373065 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.375452 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.375589 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.375856 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.375951 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.376029 4926 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.376144 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:05:30.376126794 +0000 UTC m=+170.744753127 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.376279 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.376311 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.378547 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.378683 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:05:30.378655379 +0000 UTC m=+170.747281752 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.385061 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.397289 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 
18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.407931 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:58Z is after 2025-08-24T17:21:41Z" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.488964 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.489032 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.488993 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.489192 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.489329 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.489515 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:04:58 crc kubenswrapper[4926]: I0312 18:04:58.489905 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:04:58 crc kubenswrapper[4926]: E0312 18:04:58.490174 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.549381 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.549467 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.549488 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.549516 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.549535 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:59Z","lastTransitionTime":"2026-03-12T18:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.569760 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:59Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.574483 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.574553 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.574579 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.574611 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.574637 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:59Z","lastTransitionTime":"2026-03-12T18:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.595518 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:59Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.601346 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.601411 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.601435 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.601498 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.601520 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:59Z","lastTransitionTime":"2026-03-12T18:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.618326 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:59Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.623168 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.623223 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.623241 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.623263 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.623280 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:59Z","lastTransitionTime":"2026-03-12T18:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.637957 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:59Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.641912 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.641967 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.641984 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.642007 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:04:59 crc kubenswrapper[4926]: I0312 18:04:59.642024 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:04:59Z","lastTransitionTime":"2026-03-12T18:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.656927 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:04:59Z is after 
2025-08-24T17:21:41Z" Mar 12 18:04:59 crc kubenswrapper[4926]: E0312 18:04:59.657161 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.489574 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.489698 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.489603 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:00 crc kubenswrapper[4926]: E0312 18:05:00.489813 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.489603 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:00 crc kubenswrapper[4926]: E0312 18:05:00.489945 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:00 crc kubenswrapper[4926]: E0312 18:05:00.490031 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:00 crc kubenswrapper[4926]: E0312 18:05:00.490131 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.505468 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.524303 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.543274 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.562911 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: E0312 18:05:00.583198 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.583167 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.605504 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.628182 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.643956 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.657974 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.671087 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.685288 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.696085 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.709898 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.722822 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.739959 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:00 crc kubenswrapper[4926]: I0312 18:05:00.759731 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:00Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:02 crc kubenswrapper[4926]: I0312 18:05:02.490067 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:02 crc kubenswrapper[4926]: E0312 18:05:02.491275 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:02 crc kubenswrapper[4926]: I0312 18:05:02.490171 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:02 crc kubenswrapper[4926]: E0312 18:05:02.491775 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:02 crc kubenswrapper[4926]: I0312 18:05:02.490122 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:02 crc kubenswrapper[4926]: I0312 18:05:02.490278 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:02 crc kubenswrapper[4926]: E0312 18:05:02.492176 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:02 crc kubenswrapper[4926]: E0312 18:05:02.492324 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:04 crc kubenswrapper[4926]: I0312 18:05:04.489330 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:04 crc kubenswrapper[4926]: E0312 18:05:04.489516 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:04 crc kubenswrapper[4926]: I0312 18:05:04.489578 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:04 crc kubenswrapper[4926]: E0312 18:05:04.489816 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:04 crc kubenswrapper[4926]: I0312 18:05:04.489863 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:04 crc kubenswrapper[4926]: E0312 18:05:04.489948 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:04 crc kubenswrapper[4926]: I0312 18:05:04.490582 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:04 crc kubenswrapper[4926]: E0312 18:05:04.490664 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:05 crc kubenswrapper[4926]: E0312 18:05:05.584472 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:06 crc kubenswrapper[4926]: I0312 18:05:06.489988 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:06 crc kubenswrapper[4926]: I0312 18:05:06.490015 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:06 crc kubenswrapper[4926]: I0312 18:05:06.490245 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:06 crc kubenswrapper[4926]: I0312 18:05:06.490264 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:06 crc kubenswrapper[4926]: E0312 18:05:06.491086 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:06 crc kubenswrapper[4926]: E0312 18:05:06.491628 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:06 crc kubenswrapper[4926]: E0312 18:05:06.491853 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:06 crc kubenswrapper[4926]: E0312 18:05:06.492024 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:08 crc kubenswrapper[4926]: I0312 18:05:08.489266 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:08 crc kubenswrapper[4926]: I0312 18:05:08.489267 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:08 crc kubenswrapper[4926]: I0312 18:05:08.489376 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.489521 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.489668 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.489833 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:08 crc kubenswrapper[4926]: I0312 18:05:08.491150 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.491516 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.687695 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:05:08 crc kubenswrapper[4926]: E0312 18:05:08.687930 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:05:40.68781576 +0000 UTC m=+181.056442133 (durationBeforeRetry 32s). 
Mar 12 18:05:08 crc kubenswrapper[4926]: I0312 18:05:08.688568 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7"
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.039027 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.039094 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.039112 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.039137 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.039155 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:10Z","lastTransitionTime":"2026-03-12T18:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.059579 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.064493 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.064586 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.064618 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.064655 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.064694 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:10Z","lastTransitionTime":"2026-03-12T18:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.087877 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.093841 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.093894 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
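The err= payloads in these entries also document the shape of the kubelet's node-status update: a strategic merge patch in which "$setElementOrder/conditions" pins the ordering of the conditions list while "conditions" itself carries only the entries to merge, keyed by "type". A hand-built illustration of that patch shape (values hypothetical, standard library only):

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Hypothetical strategic-merge patch mirroring the structure in the
	// log: "$setElementOrder/conditions" fixes list order; each entry in
	// "conditions" is merged into the existing list by its "type" key.
	patch := map[string]interface{}{
		"status": map[string]interface{}{
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"},
				{"type": "DiskPressure"},
				{"type": "PIDPressure"},
				{"type": "Ready"},
			},
			"conditions": []map[string]string{
				{
					"type":   "Ready",
					"status": "False",
					"reason": "KubeletNotReady",
				},
			},
		},
	}
	out, _ := json.MarshalIndent(patch, "", "  ")
	fmt.Println(string(out))
}
```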
event="NodeHasNoDiskPressure" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.093911 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.093938 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.093955 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:10Z","lastTransitionTime":"2026-03-12T18:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.117353 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.122505 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.122578 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
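The patch attempts land within roughly 80ms of each other (.059579, .087877, .117353, and a fourth just below) because the kubelet retries the status update synchronously a fixed number of times before giving up until the next sync period; upstream the constant is nodeStatusUpdateRetry = 5. A schematic of that loop, as a simplification rather than the real updateNodeStatus:

```go
package main

import (
	"errors"
	"fmt"
)

// nodeStatusUpdateRetry follows the upstream kubelet constant: the number
// of back-to-back patch attempts before the update is abandoned until the
// next sync interval.
const nodeStatusUpdateRetry = 5

// updateNodeStatus is a simplified stand-in for the kubelet's retry loop:
// each failing attempt corresponds to one "Error updating node status,
// will retry" line in the log.
func updateNodeStatus(tryUpdate func() error) error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdate(); err != nil {
			fmt.Printf("attempt %d: Error updating node status, will retry: %v\n", i+1, err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	webhookErr := errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
	fmt.Println(updateNodeStatus(func() error { return webhookErr }))
}
```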
event="NodeHasNoDiskPressure" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.122603 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.122632 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.122655 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:10Z","lastTransitionTime":"2026-03-12T18:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.141428 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.145856 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.145904 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.145920 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.145943 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.145960 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:10Z","lastTransitionTime":"2026-03-12T18:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.159836 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.159991 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.489680 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.489867 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.489908 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.489943 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.489985 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.490044 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.490523 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.490651 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.491245 4926 scope.go:117] "RemoveContainer" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.491530 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.505671 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.532987 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 
metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.555223 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.568970 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: E0312 18:05:10.585715 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.589117 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.603299 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.614765 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.627776 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.642034 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.657501 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.675848 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.689737 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.705950 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.721477 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.748283 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:10 crc kubenswrapper[4926]: I0312 18:05:10.777249 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:10Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:12 crc kubenswrapper[4926]: I0312 18:05:12.488979 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:12 crc kubenswrapper[4926]: E0312 18:05:12.489195 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:12 crc kubenswrapper[4926]: I0312 18:05:12.489328 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:12 crc kubenswrapper[4926]: E0312 18:05:12.489521 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:12 crc kubenswrapper[4926]: I0312 18:05:12.489651 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:12 crc kubenswrapper[4926]: E0312 18:05:12.489788 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:12 crc kubenswrapper[4926]: I0312 18:05:12.489945 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:12 crc kubenswrapper[4926]: E0312 18:05:12.490199 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.191229 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/0.log" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.191311 4926 generic.go:334] "Generic (PLEG): container finished" podID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" containerID="54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c" exitCode=1 Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.191362 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerDied","Data":"54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c"} Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.192013 4926 scope.go:117] "RemoveContainer" containerID="54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.210773 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.231582 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.250040 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.273155 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.286895 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.305691 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.322582 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.354087 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc4
38846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.376457 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.393499 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.412188 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.427136 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.441666 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.454018 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.465461 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.477801 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:14Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.489177 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.489307 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.489309 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:14 crc kubenswrapper[4926]: I0312 18:05:14.489215 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:14 crc kubenswrapper[4926]: E0312 18:05:14.489504 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:14 crc kubenswrapper[4926]: E0312 18:05:14.489651 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:14 crc kubenswrapper[4926]: E0312 18:05:14.489833 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:14 crc kubenswrapper[4926]: E0312 18:05:14.489988 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.197573 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/0.log" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.197650 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerStarted","Data":"bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4"} Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.214563 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.228783 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.244863 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d
56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.261409 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.275010 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.288975 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.302961 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.314807 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.329714 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.347795 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.367538 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.398188 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.421863 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.437920 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.455500 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: I0312 18:05:15.474897 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:15Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:15 crc kubenswrapper[4926]: E0312 18:05:15.587062 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:16 crc kubenswrapper[4926]: I0312 18:05:16.489747 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:16 crc kubenswrapper[4926]: I0312 18:05:16.489767 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:16 crc kubenswrapper[4926]: I0312 18:05:16.489750 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:16 crc kubenswrapper[4926]: I0312 18:05:16.489860 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:16 crc kubenswrapper[4926]: E0312 18:05:16.490068 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:16 crc kubenswrapper[4926]: E0312 18:05:16.490171 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:16 crc kubenswrapper[4926]: E0312 18:05:16.489967 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:16 crc kubenswrapper[4926]: E0312 18:05:16.490360 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:18 crc kubenswrapper[4926]: I0312 18:05:18.489517 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:18 crc kubenswrapper[4926]: I0312 18:05:18.489577 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:18 crc kubenswrapper[4926]: E0312 18:05:18.489735 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:18 crc kubenswrapper[4926]: I0312 18:05:18.489799 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:18 crc kubenswrapper[4926]: I0312 18:05:18.489518 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:18 crc kubenswrapper[4926]: E0312 18:05:18.489928 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:18 crc kubenswrapper[4926]: E0312 18:05:18.490080 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:18 crc kubenswrapper[4926]: E0312 18:05:18.490166 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.368329 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.368382 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.368395 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.368414 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.368482 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:20Z","lastTransitionTime":"2026-03-12T18:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.381284 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.385017 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.385066 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.385079 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.385096 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.385108 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:20Z","lastTransitionTime":"2026-03-12T18:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.400173 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.404122 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.404264 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.404374 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.404513 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.404633 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:20Z","lastTransitionTime":"2026-03-12T18:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.418004 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.426759 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.426807 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.426827 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.426844 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.426855 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:20Z","lastTransitionTime":"2026-03-12T18:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.442954 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.448365 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.448461 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.448480 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.448504 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.448522 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:20Z","lastTransitionTime":"2026-03-12T18:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.464313 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.464591 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.489550 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.489590 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.489651 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.489554 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.489782 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.489935 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.489996 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.490058 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.508912 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.525034 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.549098 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.566427 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.578391 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: E0312 18:05:20.587826 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.591352 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.606240 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.619334 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.633550 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.650257 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.669203 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d
56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.683920 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" 
Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.699938 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.717854 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.734255 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:20 crc kubenswrapper[4926]: I0312 18:05:20.748570 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:20Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:22 crc kubenswrapper[4926]: I0312 18:05:22.489659 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:22 crc kubenswrapper[4926]: I0312 18:05:22.489683 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:22 crc kubenswrapper[4926]: I0312 18:05:22.489725 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:22 crc kubenswrapper[4926]: I0312 18:05:22.489744 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:22 crc kubenswrapper[4926]: E0312 18:05:22.492084 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:22 crc kubenswrapper[4926]: E0312 18:05:22.492265 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:22 crc kubenswrapper[4926]: E0312 18:05:22.492635 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:22 crc kubenswrapper[4926]: E0312 18:05:22.492830 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:24 crc kubenswrapper[4926]: I0312 18:05:24.489345 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:24 crc kubenswrapper[4926]: I0312 18:05:24.489409 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:24 crc kubenswrapper[4926]: E0312 18:05:24.489579 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:24 crc kubenswrapper[4926]: E0312 18:05:24.489707 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:24 crc kubenswrapper[4926]: I0312 18:05:24.489804 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:24 crc kubenswrapper[4926]: E0312 18:05:24.489888 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:24 crc kubenswrapper[4926]: I0312 18:05:24.490065 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:24 crc kubenswrapper[4926]: E0312 18:05:24.490168 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:25 crc kubenswrapper[4926]: I0312 18:05:25.491206 4926 scope.go:117] "RemoveContainer" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" Mar 12 18:05:25 crc kubenswrapper[4926]: E0312 18:05:25.589013 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.237336 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/2.log" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.240254 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.240855 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.260345 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.275877 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.294376 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d
3a5bdc1c6910c26c5054b51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.310529 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.323984 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.338486 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.353750 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.365109 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.377392 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.403395 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d
56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.418160 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.428796 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.453112 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.468448 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.483343 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.489225 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.489280 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.489286 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:26 crc kubenswrapper[4926]: E0312 18:05:26.489350 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:26 crc kubenswrapper[4926]: E0312 18:05:26.489466 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.489493 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:26 crc kubenswrapper[4926]: E0312 18:05:26.489595 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:26 crc kubenswrapper[4926]: E0312 18:05:26.489769 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:26 crc kubenswrapper[4926]: I0312 18:05:26.498494 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:26Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.245661 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.246721 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/2.log" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.249761 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" exitCode=1 Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.249818 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.249866 4926 scope.go:117] "RemoveContainer" containerID="fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.250959 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:05:27 crc kubenswrapper[4926]: E0312 18:05:27.251481 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.277619 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe14ac05f79c0314c60edfebe5cb903048be0cc438846a149ac872d84d8178cd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:04:56Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI0312 18:04:56.427089 7216 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0312 18:04:56.427156 7216 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0312 18:04:56.427246 7216 factory.go:1336] Added *v1.Node event handler 7\\\\nI0312 18:04:56.427294 7216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0312 18:04:56.427311 7216 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0312 18:04:56.427313 7216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0312 18:04:56.427343 7216 factory.go:656] Stopping watch factory\\\\nI0312 18:04:56.427360 7216 handler.go:208] Removed *v1.Node event handler 2\\\\nI0312 18:04:56.427397 7216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0312 18:04:56.427659 7216 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0312 18:04:56.427747 7216 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0312 18:04:56.427790 7216 ovnkube.go:599] Stopped ovnkube\\\\nI0312 18:04:56.427823 7216 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0312 18:04:56.427914 7216 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:26Z\\\",\\\"message\\\":\\\"(nil)\\\\nI0312 18:05:26.683779 7541 services_controller.go:444] Built service openshift-dns/dns-default LB per-node configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"UDP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9154, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0312 18:05:26.683797 7541 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.300407 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.315379 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.336000 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.355902 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.375889 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.391508 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.410285 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.426000 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.443021 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.457738 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.471598 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.488656 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.499249 4926 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.502818 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.509490 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.523935 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:27 crc kubenswrapper[4926]: I0312 18:05:27.537519 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:27Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.259722 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.264593 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:05:28 crc kubenswrapper[4926]: E0312 18:05:28.264911 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.278751 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c31076f-c495-4763-9ea0-58712f07bb6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405f59d0da6c9a3663ed746f08f9d5c2d94818971dbc0ce0373690c731b5afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 18:03:14.139597 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 18:03:14.141034 1 observer_polling.go:159] Starting file observer\\\\nI0312 18:03:14.142357 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 18:03:14.143332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 18:03:43.393935 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0312 18:03:44.289189 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 18:03:44.289250 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f33c0bfdb670b43186efb3e52df85915bd35749a127245356f71fe96994d85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ff9f1fe3b91c7273624abf9e138c54d1d2228edc8e5ff370cdcc3b8df4a7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.291134 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.300942 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467073c1-7776-4dca-9a24-1beb51b5775f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00032a5db7175b95edd80be3a15bca3e5bfee1c8bcc8bb2353ab3b620e12b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.316667 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.334429 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.352179 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.369324 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.389214 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5d
b7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\
":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.404234 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.419449 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.436640 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.453315 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.474825 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d
3a5bdc1c6910c26c5054b51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:26Z\\\",\\\"message\\\":\\\"(nil)\\\\nI0312 18:05:26.683779 7541 services_controller.go:444] Built service openshift-dns/dns-default LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"UDP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9154, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0312 18:05:26.683797 7541 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.489251 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.489304 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:28 crc kubenswrapper[4926]: E0312 18:05:28.489410 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.489505 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:28 crc kubenswrapper[4926]: E0312 18:05:28.489617 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.489250 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:28 crc kubenswrapper[4926]: E0312 18:05:28.489706 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:28 crc kubenswrapper[4926]: E0312 18:05:28.489793 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.491667 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.502254 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.515754 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.531773 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:28 crc kubenswrapper[4926]: I0312 18:05:28.544187 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:28Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.328264 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.328567 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:34.328531653 +0000 UTC m=+234.697158036 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.329534 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.329732 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.329737 4926 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.330096 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:06:34.33007683 +0000 UTC m=+234.698703193 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.329786 4926 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.330509 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 18:06:34.330491592 +0000 UTC m=+234.699117955 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.430845 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.430909 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431052 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431094 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431114 4926 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431140 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431168 4926 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431238 4926 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431187 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 18:06:34.431162905 +0000 UTC m=+234.799789268 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.431322 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 18:06:34.431298819 +0000 UTC m=+234.799925192 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.473241 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.473311 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.473334 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.473386 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.473409 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:30Z","lastTransitionTime":"2026-03-12T18:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.489729 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.489839 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.489918 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.489923 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.489974 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.490127 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.490281 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.490402 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.493331 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 
2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.502956 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.502991 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.503007 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.503021 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.503033 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:30Z","lastTransitionTime":"2026-03-12T18:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.509173 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-synce
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.516506 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.521217 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.521290 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.521305 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.521320 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.521329 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:30Z","lastTransitionTime":"2026-03-12T18:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.526786 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.535147 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9
f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.539158 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.539197 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.539215 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.539236 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.539252 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:30Z","lastTransitionTime":"2026-03-12T18:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
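The recurring NodeNotReady condition in these entries has a concrete, checkable cause: the kubelet reports no CNI configuration file in /etc/kubernetes/cni/net.d/, and it keeps the Ready condition False until the network plugin (OVN-Kubernetes here, whose ovnkube-controller container is crash-looping further down) installs one. A minimal spot check, assuming you can run Python on the node, is simply to look in that directory:

from pathlib import Path

CNI_DIR = Path("/etc/kubernetes/cni/net.d")  # directory named in the condition

if not CNI_DIR.is_dir():
    print(f"{CNI_DIR} does not exist")
else:
    configs = sorted(
        p for p in CNI_DIR.iterdir()
        if p.suffix in {".conf", ".conflist", ".json"}
    )
    if configs:
        for p in configs:
            print("CNI config present:", p)
    else:
        print(f"no CNI configuration files under {CNI_DIR}, "
              "consistent with NetworkReady=false in the journal")

Until ovnkube-controller stays up long enough to write its configuration, this check will keep coming back empty and the node will keep flapping to NotReady.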
Has your network provider started?"} Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.541884 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.553699 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.557789 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.557840 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.557860 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.557885 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.557903 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:30Z","lastTransitionTime":"2026-03-12T18:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
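Note that the node-status patches are not retried forever: "Error updating node status, will retry" repeats a handful of times and then, a few entries later, gives way to "update node status exceeds retry count". In the kubelet this is a small bounded loop with a fixed retry budget (the upstream constant nodeStatusUpdateRetry, 5 attempts in recent releases). The sketch below mirrors that pattern in Python for illustration; it is not the actual Go implementation, and patch_once is a hypothetical stand-in for the API call.

NODE_STATUS_UPDATE_RETRY = 5  # kubelet's fixed budget of patch attempts

def update_node_status(patch_once) -> None:
    """Bounded retry matching the journal: several 'will retry' errors,
    then one final 'exceeds retry count' failure."""
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_once()
            return
        except RuntimeError as err:
            print(f"Error updating node status, will retry: {err}")
    print("Unable to update node status: "
          "update node status exceeds retry count")

def failing_patch():
    # Stand-in for the webhook call that keeps failing above.
    raise RuntimeError("x509: certificate has expired or is not yet valid")

update_node_status(failing_patch)

Because every attempt hits the same expired-certificate webhook, all five fail identically, which is exactly the sequence recorded in this journal.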
Has your network provider started?"} Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.563738 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:26Z\\\",\\\"message\\\":\\\"(nil)\\\\nI0312 18:05:26.683779 7541 services_controller.go:444] Built service openshift-dns/dns-default LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"UDP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9154, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0312 18:05:26.683797 7541 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.573139 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 
2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.573354 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.581499 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\
\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 
2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: E0312 18:05:30.590616 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.595885 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.614109 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri
-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.627093 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.644918 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.656793 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.668440 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 
18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.681583 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c31076f-c495-4763-9ea0-58712f07bb6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405f59d0da6c9a3663ed746f08f9d5c2d94818971dbc0ce0373690c731b5afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 18:03:14.139597 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 18:03:14.141034 1 observer_polling.go:159] Starting file observer\\\\nI0312 18:03:14.142357 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 18:03:14.143332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 18:03:43.393935 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0312 18:03:44.289189 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 18:03:44.289250 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f33c0bfdb670b43186efb3e52df85915bd35749a127245356f71fe96994d85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ff9f1fe3b91c7273624abf9e138c54d1d2228edc8e5ff370cdcc3b8df4a7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.691289 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.703367 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467073c1-7776-4dca-9a24-1beb51b5775f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00032a5db7175b95edd80be3a15bca3e5bfee1c8bcc8bb2353ab3b620e12b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.717615 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.736113 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.751509 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:30 crc kubenswrapper[4926]: I0312 18:05:30.765564 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:30Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:32 crc kubenswrapper[4926]: I0312 18:05:32.489360 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:32 crc kubenswrapper[4926]: I0312 18:05:32.489298 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:32 crc kubenswrapper[4926]: E0312 18:05:32.489890 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:32 crc kubenswrapper[4926]: I0312 18:05:32.489514 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:32 crc kubenswrapper[4926]: I0312 18:05:32.489478 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:32 crc kubenswrapper[4926]: E0312 18:05:32.489974 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:32 crc kubenswrapper[4926]: E0312 18:05:32.490154 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:32 crc kubenswrapper[4926]: E0312 18:05:32.490238 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:34 crc kubenswrapper[4926]: I0312 18:05:34.488956 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:34 crc kubenswrapper[4926]: I0312 18:05:34.489019 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:34 crc kubenswrapper[4926]: I0312 18:05:34.489007 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:34 crc kubenswrapper[4926]: I0312 18:05:34.488956 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:34 crc kubenswrapper[4926]: E0312 18:05:34.489209 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:34 crc kubenswrapper[4926]: E0312 18:05:34.489311 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:34 crc kubenswrapper[4926]: E0312 18:05:34.489558 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:34 crc kubenswrapper[4926]: E0312 18:05:34.489727 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:35 crc kubenswrapper[4926]: E0312 18:05:35.592356 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:36 crc kubenswrapper[4926]: I0312 18:05:36.489335 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:36 crc kubenswrapper[4926]: I0312 18:05:36.489413 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:36 crc kubenswrapper[4926]: E0312 18:05:36.489588 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:36 crc kubenswrapper[4926]: I0312 18:05:36.489627 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:36 crc kubenswrapper[4926]: I0312 18:05:36.489700 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:36 crc kubenswrapper[4926]: E0312 18:05:36.490002 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:36 crc kubenswrapper[4926]: E0312 18:05:36.490156 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:36 crc kubenswrapper[4926]: E0312 18:05:36.490288 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:36 crc kubenswrapper[4926]: I0312 18:05:36.508139 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 12 18:05:38 crc kubenswrapper[4926]: I0312 18:05:38.489763 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:38 crc kubenswrapper[4926]: I0312 18:05:38.489887 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:38 crc kubenswrapper[4926]: I0312 18:05:38.489967 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:38 crc kubenswrapper[4926]: I0312 18:05:38.490233 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:38 crc kubenswrapper[4926]: E0312 18:05:38.490246 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:38 crc kubenswrapper[4926]: E0312 18:05:38.490385 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:38 crc kubenswrapper[4926]: E0312 18:05:38.490497 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:38 crc kubenswrapper[4926]: E0312 18:05:38.490624 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.488915 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.489067 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.489130 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.489247 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.489328 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.489639 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.489748 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.489515 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.513535 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c31076f-c495-4763-9ea0-58712f07bb6c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://405f59d0da6c9a3663ed746f08f9d5c2d94818971dbc0ce0373690c731b5afae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ede7f52bcdc49e61c1fd76151da86db1e67d189c2fe147a60e207d4aa2dbfed8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0312 18:03:14.139597 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0312 18:03:14.141034 1 observer_polling.go:159] Starting file observer\\\\nI0312 18:03:14.142357 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0312 18:03:14.143332 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0312 18:03:43.393935 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0312 18:03:44.289189 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0312 18:03:44.289250 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:13Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:03:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f33c0bfdb670b43186efb3e52df85915bd35749a127245356f71fe96994d85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8ff9f1fe3b91c7273624abf9e138c54d1d2228edc8e5ff370cdcc3b8df4a7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.524584 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4gmrt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcfdbe34-faf7-4306-a2d8-6e95715f4f2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aec75b239d7939d135db129ca6960660469ae8a708cfb841456f865c900fdfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fmvtd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4gmrt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.535995 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"467073c1-7776-4dca-9a24-1beb51b5775f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a00032a5db7175b95edd80be3a15bca3e5bfee1c8bcc8bb2353ab3b620e12b08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fc024e5f387159c7533f0ba92814dda334990d955a31d8387cc943e31d4f6a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.553346 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.573552 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6084f41c213b6a4cdd05e3347e853818264c55f770f84083635e32dde284a489\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.593653 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.597289 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.608875 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7b34559-da2f-4796-8f3f-c56b2725c464\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a6ddc83150ed274605a4162acaba9d7326df9cb34f2197c177597af1cfc9d19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8t62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hmdg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.619551 4926 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f369f51b-80f9-46fb-b43d-d6e057d3ebf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:03:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16541dcf0547cc5940cc3d4d8b3007cd56ffd6fcbf3f7a042b641afeed488a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d1c51341df9f02f1ffb0ac64cf07549f04ad8b349843cd11e756ca8d3be225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70f881f53148e0d424118d6c39bcb9b736ec796d6bca00ca4a67714186d8c57c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90ff4bf24bc46ebc21c4b6f77467c73f094cca582e806b41b740ce17119226f2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.638355 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"751cfb31-c4cb-4d6a-8439-c2d4a64ccfe5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4da583451dead089b4282bf5a7ae151758817a7761131745a238968a4a1d9f75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://587b28a1d4a204a36e5a2d1828a497d64dd9774cef86641c25dd276fe3185c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59e6a444813d1bdcdb28520b832548d594b7170869334f17d0e1cb3c5c4c3338\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://357166a9a18663c77154337ba5b95c2485ed2d008dab9caf790697c54e55da27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5b57f2b6ccf2e83dc3cdc2b924966715676cca996f717685dda64c8a1556525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93765ed57dbcebda4b710bf4cafac472d4a67f6d60b259c2314ad5169301c1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f93765ed57dbcebda4b710bf4cafac472d4a67f6d60b259c2314ad5169301c1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb489c2b03ee964d954776552ca7b84509be8dfcfff33050c034d54a141bc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fb489c2b03ee964d954776552ca7b84509be8dfcfff33050c034d54a141bc63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://63c194a342c3a7d6ac13482ec064af4b05918784b371b742dd4d9e15a3c05cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63c194a342c3a7d6ac13482ec064af4b05918784b371b742dd4d9e15a3c05cb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.651997 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.667232 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13bb5191fe71e7796ec5d778fdafbd01d59eb57b8cf02c5e588a1a2714f371f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db693cb7a535d2499f919ccbabcebc5b97c340fe0dae0a02dc03121b24c9ee7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.687189 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc33af41-5aa0-4254-ac75-69433d5f4ce9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:26Z\\\",\\\"message\\\":\\\"(nil)\\\\nI0312 18:05:26.683779 7541 services_controller.go:444] Built service openshift-dns/dns-default LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"UDP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:53, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.4.10\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9154, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0312 18:05:26.683797 7541 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:05:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4t5dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zlfmg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.701839 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-srh42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d37aa11-8fa5-4eb3-8edd-6f71523623b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eda521cfe2ce4a398e537a7f447b65616af12bce78d6fad3e4aafa2a34195eb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b10730077cd07faf61ad5167a5ee7ef1630bd4b48dd478080604f8f3bd511818\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73c4731a9a0bb1d7c53c7ffcdb7e6ed42840c6e2b4247cbff3aad6f2e7b39101\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://099d0eabc51c98a9d49bdf0ea3a18627e138b6a83d5769fd8d62467de933324e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bae207872238453e4fbb97dbed373dd96b22348248989c66074062988d0cda2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c69f3072bac6f67905ba48b8f894b0d682e66c58ea7749c96507be310bee09a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1fe92341e742b5099f178691183a9e139eb4f129c53ff3a7ff9c4a8731599ef\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:04:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbt59\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-srh42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.712114 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"211eeae6-9b41-484b-bd13-99c1c28cdf96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n6bnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n7pd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.723791 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aeb621bb-05ee-456b-b869-1cdd14184ad1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-12T18:03:48Z\\\",\\\"message\\\":\\\"W0312 18:03:47.732255 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0312 18:03:47.732907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773338627 cert, and key in /tmp/serving-cert-406653352/serving-signer.crt, /tmp/serving-cert-406653352/serving-signer.key\\\\nI0312 18:03:48.334309 1 observer_polling.go:159] Starting file observer\\\\nW0312 18:03:48.343915 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0312 18:03:48.344098 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0312 18:03:48.345204 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-406653352/tls.crt::/tmp/serving-cert-406653352/tls.key\\\\\\\"\\\\nF0312 18:03:48.873093 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:03:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T18:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T18:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.735840 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f8dcba7a43670fd5cf236b71365bc171373a7d2eb9bc24c01f5e8e49260a359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.749770 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwqvl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5a53ef4-c701-457f-9cf2-85819bf04d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-12T18:05:14Z\\\",\\\"message\\\":\\\"2026-03-12T18:04:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d\\\\n2026-03-12T18:04:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_38286ed1-c497-4a74-9174-f8362ccb899d to /host/opt/cni/bin/\\\\n2026-03-12T18:04:29Z [verbose] multus-daemon started\\\\n2026-03-12T18:04:29Z [verbose] Readiness Indicator file check\\\\n2026-03-12T18:05:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:05:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bt6hw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwqvl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.760675 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f9vxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"594c806d-dd79-41ce-8e3a-a33d42bf0f7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://690b46808ceaeee6e4769dad24c7bdb281b5530a27d5079e9ddf1e55f914171e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x7bdm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f9vxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.769062 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.769200 4926 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.769272 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs podName:211eeae6-9b41-484b-bd13-99c1c28cdf96 nodeName:}" failed. No retries permitted until 2026-03-12 18:06:44.769254151 +0000 UTC m=+245.137880574 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs") pod "network-metrics-daemon-n7pd7" (UID: "211eeae6-9b41-484b-bd13-99c1c28cdf96") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.771520 4926 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12de8a94-72e6-4d72-8e39-42f3ef9d1125\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T18:04:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1c4af6ac2ce0a767b4d67627bcab2f7617b3c7845fbcb3136f3cf8931dc186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7342dbad64c6f7552247607ae0c5da9b7490241867cb0a212d424e2b581910a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T18:04:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vwn2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T18:04:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fq9dc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.970753 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.970818 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.970834 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.970850 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.970862 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:40Z","lastTransitionTime":"2026-03-12T18:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:40 crc kubenswrapper[4926]: E0312 18:05:40.989570 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:40Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.993627 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.993672 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.993682 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.993699 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:40 crc kubenswrapper[4926]: I0312 18:05:40.993713 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:40Z","lastTransitionTime":"2026-03-12T18:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:41 crc kubenswrapper[4926]: E0312 18:05:41.007604 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.011588 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.011630 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.011641 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.011658 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.011674 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:41Z","lastTransitionTime":"2026-03-12T18:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:41 crc kubenswrapper[4926]: E0312 18:05:41.025180 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.029478 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.029529 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.029544 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.029564 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.029577 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:41Z","lastTransitionTime":"2026-03-12T18:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:41 crc kubenswrapper[4926]: E0312 18:05:41.046471 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.050648 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.050698 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.050707 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.050726 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:41 crc kubenswrapper[4926]: I0312 18:05:41.050736 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:41Z","lastTransitionTime":"2026-03-12T18:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:41 crc kubenswrapper[4926]: E0312 18:05:41.067109 4926 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T18:05:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2090c8b2-af81-407e-bc9b-78510eed61ed\\\",\\\"systemUUID\\\":\\\"9f4a0cfb-e2ee-40d1-a613-eac4618fc62c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T18:05:41Z is after 2025-08-24T17:21:41Z" Mar 12 18:05:41 crc kubenswrapper[4926]: E0312 18:05:41.067235 4926 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 18:05:42 crc kubenswrapper[4926]: I0312 18:05:42.488950 4926 util.go:30] "No sandbox for pod can be found. 
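Every retry above fails identically: the kubelet's status patch is rejected because the serving certificate of the node.network-node-identity webhook (Post https://127.0.0.1:9743/node) expired on 2025-08-24T17:21:41Z while the node clock reads 2026-03-12. A minimal sketch of how one might confirm the expiry from the node, assuming Python 3 with the third-party cryptography package available (not something the log itself shows):

    import ssl
    from datetime import datetime, timezone
    from cryptography import x509  # assumed installed; not part of the node's stock tooling

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the Post URL in the errors

    # With no CA bundle passed, get_server_certificate skips chain verification,
    # so the PEM of an already-expired certificate can still be fetched.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)  # library returns naive UTC
    now = datetime.now(timezone.utc)
    print("notAfter:", not_after.isoformat())  # per the log: 2025-08-24T17:21:41+00:00
    print("expired:", now > not_after)         # True while the clock reads 2026-03-12

The same check against any suspect endpoint distinguishes an expired certificate from the other failure the x509 message allows for ("not yet valid", i.e. a clock running behind the certificate's notBefore date).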
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:42 crc kubenswrapper[4926]: I0312 18:05:42.488994 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:42 crc kubenswrapper[4926]: E0312 18:05:42.489149 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:42 crc kubenswrapper[4926]: I0312 18:05:42.489183 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:42 crc kubenswrapper[4926]: E0312 18:05:42.489356 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:42 crc kubenswrapper[4926]: I0312 18:05:42.489489 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:42 crc kubenswrapper[4926]: E0312 18:05:42.489506 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:42 crc kubenswrapper[4926]: E0312 18:05:42.489640 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:43 crc kubenswrapper[4926]: I0312 18:05:43.490888 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:05:43 crc kubenswrapper[4926]: E0312 18:05:43.491334 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:05:44 crc kubenswrapper[4926]: I0312 18:05:44.489199 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:44 crc kubenswrapper[4926]: I0312 18:05:44.489278 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:44 crc kubenswrapper[4926]: I0312 18:05:44.489232 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:44 crc kubenswrapper[4926]: E0312 18:05:44.489376 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:44 crc kubenswrapper[4926]: E0312 18:05:44.489551 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:44 crc kubenswrapper[4926]: E0312 18:05:44.489624 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:44 crc kubenswrapper[4926]: I0312 18:05:44.489755 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:44 crc kubenswrapper[4926]: E0312 18:05:44.490079 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:45 crc kubenswrapper[4926]: E0312 18:05:45.595032 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:46 crc kubenswrapper[4926]: I0312 18:05:46.489951 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:46 crc kubenswrapper[4926]: I0312 18:05:46.490030 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:46 crc kubenswrapper[4926]: E0312 18:05:46.490122 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:46 crc kubenswrapper[4926]: I0312 18:05:46.490225 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:46 crc kubenswrapper[4926]: E0312 18:05:46.490269 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:46 crc kubenswrapper[4926]: I0312 18:05:46.490414 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:46 crc kubenswrapper[4926]: E0312 18:05:46.490517 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:46 crc kubenswrapper[4926]: E0312 18:05:46.490668 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:48 crc kubenswrapper[4926]: I0312 18:05:48.489171 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:48 crc kubenswrapper[4926]: I0312 18:05:48.489241 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:48 crc kubenswrapper[4926]: I0312 18:05:48.489257 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:48 crc kubenswrapper[4926]: E0312 18:05:48.489338 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:48 crc kubenswrapper[4926]: I0312 18:05:48.489491 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:48 crc kubenswrapper[4926]: E0312 18:05:48.489494 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:48 crc kubenswrapper[4926]: E0312 18:05:48.489602 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:48 crc kubenswrapper[4926]: E0312 18:05:48.489775 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.489336 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.489336 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:50 crc kubenswrapper[4926]: E0312 18:05:50.489472 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.489501 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.489423 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:50 crc kubenswrapper[4926]: E0312 18:05:50.489609 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:50 crc kubenswrapper[4926]: E0312 18:05:50.489693 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:50 crc kubenswrapper[4926]: E0312 18:05:50.489742 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.520931 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=23.520898785 podStartE2EDuration="23.520898785s" podCreationTimestamp="2026-03-12 18:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.520670229 +0000 UTC m=+190.889296562" watchObservedRunningTime="2026-03-12 18:05:50.520898785 +0000 UTC m=+190.889525158" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.552064 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4gmrt" podStartSLOduration=119.552031218 podStartE2EDuration="1m59.552031218s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.536214449 +0000 UTC m=+190.904840782" watchObservedRunningTime="2026-03-12 18:05:50.552031218 +0000 UTC m=+190.920657551" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.574347 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podStartSLOduration=119.574304818 podStartE2EDuration="1m59.574304818s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.562846428 +0000 UTC m=+190.931472771" watchObservedRunningTime="2026-03-12 18:05:50.574304818 +0000 UTC m=+190.942931151" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.575013 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.575005098 podStartE2EDuration="23.575005098s" podCreationTimestamp="2026-03-12 18:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.574247416 +0000 UTC m=+190.942873749" watchObservedRunningTime="2026-03-12 18:05:50.575005098 +0000 UTC m=+190.943631431" Mar 12 18:05:50 crc kubenswrapper[4926]: E0312 18:05:50.596533 4926 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.680146 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-srh42" podStartSLOduration=119.680130502 podStartE2EDuration="1m59.680130502s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.678196625 +0000 UTC m=+191.046822958" watchObservedRunningTime="2026-03-12 18:05:50.680130502 +0000 UTC m=+191.048756835" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.710929 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=72.710906044 podStartE2EDuration="1m12.710906044s" podCreationTimestamp="2026-03-12 18:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.710606275 +0000 UTC m=+191.079232608" watchObservedRunningTime="2026-03-12 18:05:50.710906044 +0000 UTC m=+191.079532377" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.734521 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=14.734499652 podStartE2EDuration="14.734499652s" podCreationTimestamp="2026-03-12 18:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.734141443 +0000 UTC m=+191.102767776" watchObservedRunningTime="2026-03-12 18:05:50.734499652 +0000 UTC m=+191.103125995" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.769687 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f9vxh" podStartSLOduration=119.769663545 podStartE2EDuration="1m59.769663545s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.757259017 +0000 UTC m=+191.125885350" watchObservedRunningTime="2026-03-12 18:05:50.769663545 +0000 UTC m=+191.138289918" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.793680 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fq9dc" podStartSLOduration=119.793658895 podStartE2EDuration="1m59.793658895s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.769993915 +0000 UTC m=+191.138620248" watchObservedRunningTime="2026-03-12 18:05:50.793658895 +0000 UTC m=+191.162285248" Mar 12 18:05:50 crc kubenswrapper[4926]: I0312 18:05:50.794010 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.794004235 podStartE2EDuration="1m21.794004235s" podCreationTimestamp="2026-03-12 18:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.792706357 
+0000 UTC m=+191.161332700" watchObservedRunningTime="2026-03-12 18:05:50.794004235 +0000 UTC m=+191.162630578" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.216233 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.216583 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.216727 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.216864 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.216992 4926 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T18:05:51Z","lastTransitionTime":"2026-03-12T18:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.270724 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xwqvl" podStartSLOduration=120.270694997 podStartE2EDuration="2m0.270694997s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:50.876945273 +0000 UTC m=+191.245571596" watchObservedRunningTime="2026-03-12 18:05:51.270694997 +0000 UTC m=+191.639321370" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.272275 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh"] Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.273085 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.276635 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.276922 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.277107 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.277333 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.285051 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.285110 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.285265 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.285295 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.285367 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.386230 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc 
kubenswrapper[4926]: I0312 18:05:51.387062 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.387128 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.387151 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.387174 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.387416 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.387520 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.388137 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.396537 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.407754 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a27d1b4e-7996-4cbe-a080-c3fbc9f865ef-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s7lzh\" (UID: \"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.543928 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.551523 4926 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 18:05:51 crc kubenswrapper[4926]: I0312 18:05:51.585314 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.349889 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" event={"ID":"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef","Type":"ContainerStarted","Data":"07414689799ff8b48532f3abea063045d7db46c7a9af2329b1e92826e7d27a8a"} Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.350200 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" event={"ID":"a27d1b4e-7996-4cbe-a080-c3fbc9f865ef","Type":"ContainerStarted","Data":"7a9c9a091fc5f64dc99b67261f0cdb0d61c11b9d3fcb0cc7436de459525e3eda"} Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.367735 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s7lzh" podStartSLOduration=121.367714334 podStartE2EDuration="2m1.367714334s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:05:52.367095126 +0000 UTC m=+192.735721459" watchObservedRunningTime="2026-03-12 18:05:52.367714334 +0000 UTC m=+192.736340667" Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.489616 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.489657 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.489696 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:52 crc kubenswrapper[4926]: I0312 18:05:52.489712 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:52 crc kubenswrapper[4926]: E0312 18:05:52.490180 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:52 crc kubenswrapper[4926]: E0312 18:05:52.490320 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:52 crc kubenswrapper[4926]: E0312 18:05:52.490356 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:52 crc kubenswrapper[4926]: E0312 18:05:52.490481 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:54 crc kubenswrapper[4926]: I0312 18:05:54.489020 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:54 crc kubenswrapper[4926]: E0312 18:05:54.489238 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:54 crc kubenswrapper[4926]: I0312 18:05:54.489279 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:54 crc kubenswrapper[4926]: I0312 18:05:54.489292 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:54 crc kubenswrapper[4926]: E0312 18:05:54.489646 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:54 crc kubenswrapper[4926]: I0312 18:05:54.489953 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:54 crc kubenswrapper[4926]: E0312 18:05:54.490085 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:54 crc kubenswrapper[4926]: E0312 18:05:54.490995 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:54 crc kubenswrapper[4926]: I0312 18:05:54.491431 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:05:54 crc kubenswrapper[4926]: E0312 18:05:54.491687 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zlfmg_openshift-ovn-kubernetes(bc33af41-5aa0-4254-ac75-69433d5f4ce9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" Mar 12 18:05:55 crc kubenswrapper[4926]: E0312 18:05:55.598563 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:05:56 crc kubenswrapper[4926]: I0312 18:05:56.489992 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:56 crc kubenswrapper[4926]: I0312 18:05:56.490060 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:56 crc kubenswrapper[4926]: I0312 18:05:56.489993 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:56 crc kubenswrapper[4926]: E0312 18:05:56.490138 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:05:56 crc kubenswrapper[4926]: I0312 18:05:56.490207 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:56 crc kubenswrapper[4926]: E0312 18:05:56.490271 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:56 crc kubenswrapper[4926]: E0312 18:05:56.490369 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:56 crc kubenswrapper[4926]: E0312 18:05:56.490423 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:58 crc kubenswrapper[4926]: I0312 18:05:58.489088 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:05:58 crc kubenswrapper[4926]: E0312 18:05:58.490180 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:05:58 crc kubenswrapper[4926]: I0312 18:05:58.489281 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:05:58 crc kubenswrapper[4926]: E0312 18:05:58.490380 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:05:58 crc kubenswrapper[4926]: I0312 18:05:58.489247 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:05:58 crc kubenswrapper[4926]: I0312 18:05:58.489316 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:05:58 crc kubenswrapper[4926]: E0312 18:05:58.490618 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:05:58 crc kubenswrapper[4926]: E0312 18:05:58.490842 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.378230 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/1.log" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.379108 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/0.log" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.379165 4926 generic.go:334] "Generic (PLEG): container finished" podID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" containerID="bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4" exitCode=1 Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.379198 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerDied","Data":"bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4"} Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.379231 4926 scope.go:117] "RemoveContainer" containerID="54fa97a4cd767400fe757a9f362d3aa29c51699ebd5671c32a04d7fbad9d6c6c" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.379552 4926 scope.go:117] "RemoveContainer" containerID="bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.379706 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xwqvl_openshift-multus(d5a53ef4-c701-457f-9cf2-85819bf04d1a)\"" pod="openshift-multus/multus-xwqvl" podUID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.489064 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.489106 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.489526 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.490465 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:00 crc kubenswrapper[4926]: I0312 18:06:00.490531 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.490691 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.490784 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.490997 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:00 crc kubenswrapper[4926]: E0312 18:06:00.600043 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:06:01 crc kubenswrapper[4926]: I0312 18:06:01.383990 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/1.log" Mar 12 18:06:02 crc kubenswrapper[4926]: I0312 18:06:02.489387 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:02 crc kubenswrapper[4926]: E0312 18:06:02.489596 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:02 crc kubenswrapper[4926]: I0312 18:06:02.489863 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:02 crc kubenswrapper[4926]: E0312 18:06:02.489978 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:02 crc kubenswrapper[4926]: I0312 18:06:02.490193 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:02 crc kubenswrapper[4926]: E0312 18:06:02.490299 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:02 crc kubenswrapper[4926]: I0312 18:06:02.490605 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:02 crc kubenswrapper[4926]: E0312 18:06:02.490747 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:04 crc kubenswrapper[4926]: I0312 18:06:04.489282 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:04 crc kubenswrapper[4926]: I0312 18:06:04.489364 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:04 crc kubenswrapper[4926]: I0312 18:06:04.489282 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:04 crc kubenswrapper[4926]: E0312 18:06:04.489553 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:04 crc kubenswrapper[4926]: E0312 18:06:04.489681 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:04 crc kubenswrapper[4926]: E0312 18:06:04.489925 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:04 crc kubenswrapper[4926]: I0312 18:06:04.490678 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:04 crc kubenswrapper[4926]: E0312 18:06:04.490792 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:05 crc kubenswrapper[4926]: E0312 18:06:05.601585 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:06:06 crc kubenswrapper[4926]: I0312 18:06:06.489891 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:06 crc kubenswrapper[4926]: I0312 18:06:06.489919 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:06 crc kubenswrapper[4926]: I0312 18:06:06.489974 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:06 crc kubenswrapper[4926]: E0312 18:06:06.490728 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:06 crc kubenswrapper[4926]: E0312 18:06:06.490825 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:06 crc kubenswrapper[4926]: I0312 18:06:06.489993 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:06 crc kubenswrapper[4926]: E0312 18:06:06.490982 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:06 crc kubenswrapper[4926]: E0312 18:06:06.491284 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:07 crc kubenswrapper[4926]: I0312 18:06:07.490137 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.407279 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.409582 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerStarted","Data":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.410009 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.440594 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7pd7"] Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.440691 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:08 crc kubenswrapper[4926]: E0312 18:06:08.440762 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.489639 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.489751 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:08 crc kubenswrapper[4926]: I0312 18:06:08.489781 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:08 crc kubenswrapper[4926]: E0312 18:06:08.490095 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:08 crc kubenswrapper[4926]: E0312 18:06:08.489940 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:08 crc kubenswrapper[4926]: E0312 18:06:08.490200 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:10 crc kubenswrapper[4926]: I0312 18:06:10.488927 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:10 crc kubenswrapper[4926]: I0312 18:06:10.490984 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:10 crc kubenswrapper[4926]: I0312 18:06:10.491011 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:10 crc kubenswrapper[4926]: I0312 18:06:10.491050 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:10 crc kubenswrapper[4926]: E0312 18:06:10.491131 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:10 crc kubenswrapper[4926]: E0312 18:06:10.491230 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:10 crc kubenswrapper[4926]: E0312 18:06:10.491312 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:10 crc kubenswrapper[4926]: E0312 18:06:10.491815 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:10 crc kubenswrapper[4926]: E0312 18:06:10.603029 4926 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.489838 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.489929 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:12 crc kubenswrapper[4926]: E0312 18:06:12.490054 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.490083 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:12 crc kubenswrapper[4926]: E0312 18:06:12.490226 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.490294 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:12 crc kubenswrapper[4926]: E0312 18:06:12.490536 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:12 crc kubenswrapper[4926]: E0312 18:06:12.491864 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.492480 4926 scope.go:117] "RemoveContainer" containerID="bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4" Mar 12 18:06:12 crc kubenswrapper[4926]: I0312 18:06:12.515220 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podStartSLOduration=141.515197578 podStartE2EDuration="2m21.515197578s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:08.44646175 +0000 UTC m=+208.815088103" watchObservedRunningTime="2026-03-12 18:06:12.515197578 +0000 UTC m=+212.883823951" Mar 12 18:06:13 crc kubenswrapper[4926]: I0312 18:06:13.431768 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/1.log" Mar 12 18:06:13 crc kubenswrapper[4926]: I0312 18:06:13.432130 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerStarted","Data":"2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481"} Mar 12 18:06:14 crc kubenswrapper[4926]: I0312 18:06:14.489906 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:14 crc kubenswrapper[4926]: I0312 18:06:14.489949 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:14 crc kubenswrapper[4926]: I0312 18:06:14.490005 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:14 crc kubenswrapper[4926]: E0312 18:06:14.490038 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 18:06:14 crc kubenswrapper[4926]: I0312 18:06:14.490059 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:14 crc kubenswrapper[4926]: E0312 18:06:14.490179 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n7pd7" podUID="211eeae6-9b41-484b-bd13-99c1c28cdf96" Mar 12 18:06:14 crc kubenswrapper[4926]: E0312 18:06:14.490220 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 18:06:14 crc kubenswrapper[4926]: E0312 18:06:14.490272 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.489068 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.490077 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.490309 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.490376 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.493045 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.493312 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.494754 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.494846 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.495833 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 18:06:16 crc kubenswrapper[4926]: I0312 18:06:16.496077 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.232606 4926 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.337764 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.338379 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.338580 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.338726 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qs58t"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.339047 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.339197 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.339474 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.339807 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.340273 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bfn44"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.340732 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.342175 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z95pp"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.342956 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.349560 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350101 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350269 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350281 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350484 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350621 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350658 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350845 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351031 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351078 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.350629 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351197 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351268 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351399 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351713 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351817 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.351851 4926 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"service-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.352029 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.353788 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.354331 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.354879 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.354893 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.355220 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.360902 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.361378 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.361535 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.361543 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.361711 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.361885 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.363312 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.363536 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.363684 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.363826 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.363934 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.364030 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.370922 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.371129 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.371622 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.371791 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.371796 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.372487 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.374980 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s9tvm"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.375825 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377069 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377156 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377322 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377399 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377621 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377715 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377909 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.377954 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378096 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378279 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378511 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378699 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378803 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.378904 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.379007 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.380726 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.383583 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.395962 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.401604 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.402662 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.403254 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.407930 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.410047 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.410634 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.411349 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wsbc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.412227 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.412644 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.412870 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-st67x"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.412889 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413165 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413272 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj7xf\" (UniqueName: \"kubernetes.io/projected/7e94e6e0-b16d-462f-b791-ba20acdcb809-kube-api-access-fj7xf\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413301 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c30a7ad-f92b-445a-9201-fe55f247cf41-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413332 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413348 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkcw\" (UniqueName: \"kubernetes.io/projected/48a0fa25-6b2d-4668-b8e5-824912077f19-kube-api-access-snkcw\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413386 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-config\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413402 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/49e5e304-df7c-434b-8b17-f520e9bb7d52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 
18:06:22.413417 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-serving-cert\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413433 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413469 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-serving-cert\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413485 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413509 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28x4l\" (UniqueName: \"kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413522 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-audit\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413537 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-encryption-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413554 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w622m\" (UniqueName: \"kubernetes.io/projected/4c2f00a8-c3ce-4957-9093-d8c2cce49992-kube-api-access-w622m\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413569 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-dir\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413585 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-config\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413606 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413623 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-audit-dir\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413691 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413707 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46vq\" (UniqueName: \"kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413768 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bh26\" (UniqueName: \"kubernetes.io/projected/49e5e304-df7c-434b-8b17-f520e9bb7d52-kube-api-access-6bh26\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413787 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413813 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413833 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413849 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-serving-cert\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413870 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413884 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413918 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-service-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413937 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413952 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413972 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.413993 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-st67x" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414011 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414027 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414045 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-client\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414074 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414105 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414123 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-serving-cert\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414138 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-client\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414190 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-images\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414211 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414232 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqbzk\" (UniqueName: \"kubernetes.io/projected/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-kube-api-access-hqbzk\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414252 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-config\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414269 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88hg4\" (UniqueName: \"kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414284 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zg78"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414293 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-node-pullsecrets\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414313 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62f9n\" (UniqueName: \"kubernetes.io/projected/5db62dca-ba86-4ca4-861e-003d09e5ac0f-kube-api-access-62f9n\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414342 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414359 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414408 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db62dca-ba86-4ca4-861e-003d09e5ac0f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414429 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414473 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-image-import-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414492 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414521 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-client\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414539 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: 
\"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414556 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgdh\" (UniqueName: \"kubernetes.io/projected/55b6dd11-c219-4b24-90eb-dbc096a67835-kube-api-access-jbgdh\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414578 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414601 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c30a7ad-f92b-445a-9201-fe55f247cf41-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414620 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414651 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-policies\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414669 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7dxn\" (UniqueName: \"kubernetes.io/projected/8c30a7ad-f92b-445a-9201-fe55f247cf41-kube-api-access-c7dxn\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414727 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-service-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414774 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55b6dd11-c219-4b24-90eb-dbc096a67835-machine-approver-tls\") pod 
\"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414791 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414812 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-serving-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414851 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414886 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414888 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414928 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-auth-proxy-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.414981 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.415015 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ctp\" (UniqueName: \"kubernetes.io/projected/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-kube-api-access-n5ctp\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.415035 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-encryption-config\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.415414 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h95sl"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.415965 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.417164 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.420040 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.420291 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.420510 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.420688 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.422669 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.422848 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.423147 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.424209 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.424721 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.424755 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.424918 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.424972 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425092 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425123 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425225 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425303 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425425 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.425869 4926 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.426308 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.426826 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.427163 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bsvxw"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.427646 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.427875 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.427969 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428111 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428342 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428492 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428621 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428879 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428907 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.428949 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.429072 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.429432 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.429077 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.431425 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.432056 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.432182 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.432476 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433118 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433350 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433414 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433561 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433696 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433727 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433851 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.433981 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.434065 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.434186 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.434219 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.434417 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.461923 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wndvq"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.462677 4926 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.466724 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.468030 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.468518 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469246 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469316 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469246 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469695 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469713 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.469923 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.470431 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.470804 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.471713 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.486226 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.486309 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.486881 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.487581 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.488079 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.488927 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.505811 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.508313 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.508911 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.509579 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.511266 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.511505 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.511786 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.514580 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.515620 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.515724 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517062 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517107 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bh26\" (UniqueName: \"kubernetes.io/projected/49e5e304-df7c-434b-8b17-f520e9bb7d52-kube-api-access-6bh26\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517137 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrhq\" (UniqueName: \"kubernetes.io/projected/1dc82997-2782-4e9e-a293-956fcb96acde-kube-api-access-xfrhq\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517156 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517176 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517192 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-service-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517209 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-serving-cert\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517226 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: 
\"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517241 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517259 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dc82997-2782-4e9e-a293-956fcb96acde-metrics-tls\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517279 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517296 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517312 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517329 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-trusted-ca\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517346 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517363 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc 
kubenswrapper[4926]: I0312 18:06:22.517379 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517396 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517457 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-serving-cert\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517478 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-client\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517493 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517509 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517525 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqbzk\" (UniqueName: \"kubernetes.io/projected/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-kube-api-access-hqbzk\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517543 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-client\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517558 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct49b\" (UniqueName: 
\"kubernetes.io/projected/e379fe1d-7780-4f17-8df8-f74f3dddbc23-kube-api-access-ct49b\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517575 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-images\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517591 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517607 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-config\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517625 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-stats-auth\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517644 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-config\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517660 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88hg4\" (UniqueName: \"kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517679 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62f9n\" (UniqueName: \"kubernetes.io/projected/5db62dca-ba86-4ca4-861e-003d09e5ac0f-kube-api-access-62f9n\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517695 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-node-pullsecrets\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " 
pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517713 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517730 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517751 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517771 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517788 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee39302-0316-4481-871e-538ffd31a507-serving-cert\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517817 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db62dca-ba86-4ca4-861e-003d09e5ac0f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517832 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517849 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-image-import-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: 
I0312 18:06:22.517871 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-client\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517896 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517917 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c30a7ad-f92b-445a-9201-fe55f247cf41-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517932 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgdh\" (UniqueName: \"kubernetes.io/projected/55b6dd11-c219-4b24-90eb-dbc096a67835-kube-api-access-jbgdh\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517948 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517965 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7dxn\" (UniqueName: \"kubernetes.io/projected/8c30a7ad-f92b-445a-9201-fe55f247cf41-kube-api-access-c7dxn\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517979 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.517996 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjftj\" (UniqueName: \"kubernetes.io/projected/7ee39302-0316-4481-871e-538ffd31a507-kube-api-access-tjftj\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518011 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-default-certificate\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518027 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-policies\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518070 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e379fe1d-7780-4f17-8df8-f74f3dddbc23-service-ca-bundle\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518094 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-service-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518118 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55b6dd11-c219-4b24-90eb-dbc096a67835-machine-approver-tls\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518153 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518168 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518182 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-serving-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518198 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518215 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-auth-proxy-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518237 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518257 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-encryption-config\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518276 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-config\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518297 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ctp\" (UniqueName: \"kubernetes.io/projected/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-kube-api-access-n5ctp\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518321 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518342 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj7xf\" (UniqueName: \"kubernetes.io/projected/7e94e6e0-b16d-462f-b791-ba20acdcb809-kube-api-access-fj7xf\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518363 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c30a7ad-f92b-445a-9201-fe55f247cf41-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 
18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518381 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518398 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snkcw\" (UniqueName: \"kubernetes.io/projected/48a0fa25-6b2d-4668-b8e5-824912077f19-kube-api-access-snkcw\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518415 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-config\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518431 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/49e5e304-df7c-434b-8b17-f520e9bb7d52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518465 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-serving-cert\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518486 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518503 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-serving-cert\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518521 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518538 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518557 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28x4l\" (UniqueName: \"kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518572 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-audit\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518587 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-encryption-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518603 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w622m\" (UniqueName: \"kubernetes.io/projected/4c2f00a8-c3ce-4957-9093-d8c2cce49992-kube-api-access-w622m\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518620 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-dir\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518636 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-config\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518652 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-audit-dir\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518676 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518691 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-metrics-certs\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.518708 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46vq\" (UniqueName: \"kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.519516 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-images\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.519927 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.521739 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.532176 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.533198 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-config\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.533653 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-config\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.533863 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-node-pullsecrets\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 
18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.534860 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-serving-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.535874 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.536406 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-audit\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.536770 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-auth-proxy-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.547862 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.550004 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.550083 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.550216 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.550482 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-service-ca\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.550657 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.551172 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/49e5e304-df7c-434b-8b17-f520e9bb7d52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.551598 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.551810 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-config\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.551961 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-dir\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.552483 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.552642 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.552879 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.559857 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.560015 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" 
(UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.560868 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.561062 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.561096 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.561794 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c30a7ad-f92b-445a-9201-fe55f247cf41-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.562193 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.562263 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7e94e6e0-b16d-462f-b791-ba20acdcb809-audit-policies\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.562836 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5e304-df7c-434b-8b17-f520e9bb7d52-config\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.563223 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-etcd-client\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.564814 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.565313 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.565360 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48a0fa25-6b2d-4668-b8e5-824912077f19-audit-dir\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.565787 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55b6dd11-c219-4b24-90eb-dbc096a67835-config\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.566062 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.566365 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-serving-cert\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.566564 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.567709 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/48a0fa25-6b2d-4668-b8e5-824912077f19-encryption-config\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.567908 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/48a0fa25-6b2d-4668-b8e5-824912077f19-image-import-ca\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.567994 4926 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.568019 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-etcd-client\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.568315 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-serving-cert\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.568616 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.568888 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.569365 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-serving-cert\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.569489 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e94e6e0-b16d-462f-b791-ba20acdcb809-encryption-config\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.569688 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.570066 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.570116 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.570208 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rx7tc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.570888 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.571084 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.571422 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555646-wqpkb"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.571768 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.572468 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.572500 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.572515 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.572714 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.573269 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qs58t"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.573305 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.573367 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.573519 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574138 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-service-ca-bundle\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574486 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bfn44"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574508 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574520 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574531 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z95pp"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574544 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.574991 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-serving-cert\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.575138 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.575190 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.575234 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rxml7"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.575268 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.575426 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.576330 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c9xl5"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.576626 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.585480 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.585519 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bsvxw"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.585531 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.589750 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.589867 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c30a7ad-f92b-445a-9201-fe55f247cf41-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.590304 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4c2f00a8-c3ce-4957-9093-d8c2cce49992-etcd-client\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.590397 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/55b6dd11-c219-4b24-90eb-dbc096a67835-machine-approver-tls\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 
12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.590792 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.592286 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.593515 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.595333 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zg78"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.599371 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wsbc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.599984 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.600609 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.604478 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db62dca-ba86-4ca4-861e-003d09e5ac0f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.606320 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.610169 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.611392 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-st67x"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.620279 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.621632 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.622372 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-config\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623071 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-metrics-certs\") pod 
\"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623198 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrhq\" (UniqueName: \"kubernetes.io/projected/1dc82997-2782-4e9e-a293-956fcb96acde-kube-api-access-xfrhq\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623242 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-trusted-ca\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623266 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dc82997-2782-4e9e-a293-956fcb96acde-metrics-tls\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623297 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623344 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct49b\" (UniqueName: \"kubernetes.io/projected/e379fe1d-7780-4f17-8df8-f74f3dddbc23-kube-api-access-ct49b\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623374 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-config\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623395 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-stats-auth\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623736 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623835 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ee39302-0316-4481-871e-538ffd31a507-serving-cert\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623849 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-config\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623946 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjftj\" (UniqueName: \"kubernetes.io/projected/7ee39302-0316-4481-871e-538ffd31a507-kube-api-access-tjftj\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.623975 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-default-certificate\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.624035 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e379fe1d-7780-4f17-8df8-f74f3dddbc23-service-ca-bundle\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.626505 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ee39302-0316-4481-871e-538ffd31a507-trusted-ca\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.626628 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.628245 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1dc82997-2782-4e9e-a293-956fcb96acde-metrics-tls\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.628811 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.633203 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s9tvm"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.633863 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7ee39302-0316-4481-871e-538ffd31a507-serving-cert\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.635175 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.638873 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.640935 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.641707 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.644147 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rx7tc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.645713 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.648130 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.650103 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h95sl"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.651168 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.652457 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.653938 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.654837 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.656390 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rxml7"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.657421 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.659595 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.660664 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.660636 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.661692 
4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.662696 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-drmq2"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.663917 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.664078 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pd4lm"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.665003 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pd4lm" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.665017 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555646-wqpkb"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.666065 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pd4lm"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.667416 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c9xl5"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.668798 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"] Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.687117 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.701235 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.720702 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.741808 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.769817 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.780904 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.801509 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.821308 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.841009 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.860358 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 18:06:22 crc 
kubenswrapper[4926]: I0312 18:06:22.880741 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.900580 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.921049 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.940547 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.961217 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 18:06:22 crc kubenswrapper[4926]: I0312 18:06:22.982091 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.001643 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.021187 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.041657 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.061158 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.081088 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.101292 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.120989 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.141374 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.148345 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.161706 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.167158 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-config\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.201912 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.208208 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-metrics-certs\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.221240 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.230158 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-default-certificate\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.241758 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.250732 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e379fe1d-7780-4f17-8df8-f74f3dddbc23-stats-auth\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.261040 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.281628 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.301214 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.307106 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e379fe1d-7780-4f17-8df8-f74f3dddbc23-service-ca-bundle\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.322584 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.341233 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.361183 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 
18:06:23.381785 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.454930 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.457614 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.458568 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.460775 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.481152 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.499859 4926 request.go:700] Waited for 1.011481135s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0 Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.501810 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.521119 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.549375 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.561254 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.580683 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.600141 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.620814 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.641384 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.660879 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.682005 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.701755 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.721590 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.742125 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.761953 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.801333 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.846777 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bh26\" (UniqueName: \"kubernetes.io/projected/49e5e304-df7c-434b-8b17-f520e9bb7d52-kube-api-access-6bh26\") pod \"machine-api-operator-5694c8668f-bfn44\" (UID: \"49e5e304-df7c-434b-8b17-f520e9bb7d52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.861667 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46vq\" (UniqueName: \"kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq\") pod \"console-f9d7485db-vb9qx\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.895586 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88hg4\" (UniqueName: \"kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4\") pod \"oauth-openshift-558db77b4-7rmsn\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.908024 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62f9n\" (UniqueName: \"kubernetes.io/projected/5db62dca-ba86-4ca4-861e-003d09e5ac0f-kube-api-access-62f9n\") pod \"cluster-samples-operator-665b6dd947-hwwdz\" (UID: \"5db62dca-ba86-4ca4-861e-003d09e5ac0f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.919050 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqbzk\" (UniqueName: \"kubernetes.io/projected/64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a-kube-api-access-hqbzk\") pod \"authentication-operator-69f744f599-qs58t\" (UID: \"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.936334 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgdh\" (UniqueName: \"kubernetes.io/projected/55b6dd11-c219-4b24-90eb-dbc096a67835-kube-api-access-jbgdh\") pod \"machine-approver-56656f9798-qbxxz\" (UID: \"55b6dd11-c219-4b24-90eb-dbc096a67835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.955004 4926 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w622m\" (UniqueName: \"kubernetes.io/projected/4c2f00a8-c3ce-4957-9093-d8c2cce49992-kube-api-access-w622m\") pod \"etcd-operator-b45778765-2wsbc\" (UID: \"4c2f00a8-c3ce-4957-9093-d8c2cce49992\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.976542 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ctp\" (UniqueName: \"kubernetes.io/projected/a5e9834d-5aeb-4154-ae70-c2b6b07c9eca-kube-api-access-n5ctp\") pod \"openshift-apiserver-operator-796bbdcf4f-q2smg\" (UID: \"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.994633 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7dxn\" (UniqueName: \"kubernetes.io/projected/8c30a7ad-f92b-445a-9201-fe55f247cf41-kube-api-access-c7dxn\") pod \"openshift-controller-manager-operator-756b6f6bc6-x7fm2\" (UID: \"8c30a7ad-f92b-445a-9201-fe55f247cf41\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:23 crc kubenswrapper[4926]: I0312 18:06:23.998048 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.018528 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.019125 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj7xf\" (UniqueName: \"kubernetes.io/projected/7e94e6e0-b16d-462f-b791-ba20acdcb809-kube-api-access-fj7xf\") pod \"apiserver-7bbb656c7d-9bzkt\" (UID: \"7e94e6e0-b16d-462f-b791-ba20acdcb809\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.049223 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkcw\" (UniqueName: \"kubernetes.io/projected/48a0fa25-6b2d-4668-b8e5-824912077f19-kube-api-access-snkcw\") pod \"apiserver-76f77b778f-z95pp\" (UID: \"48a0fa25-6b2d-4668-b8e5-824912077f19\") " pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.062550 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.063885 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28x4l\" (UniqueName: \"kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l\") pod \"route-controller-manager-6576b87f9c-svrln\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.081990 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.101414 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.118684 4926 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.121503 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.124686 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.127375 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.141767 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.141832 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.147802 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.155605 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.162619 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.163296 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.171583 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.182004 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.201038 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.213823 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.222412 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.233553 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.241685 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.265523 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.280992 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.302205 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.302497 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bfn44"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.330172 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.344396 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.361035 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.381105 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.413975 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.435916 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.457359 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.461186 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.474084 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.480769 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.494756 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vb9qx" event={"ID":"270031fa-3d83-4edf-bb5d-19ce9e1a693d","Type":"ContainerStarted","Data":"8b8722347bac291c780b1fdd439b9d8743876830528929d7bd120c0953d14111"} Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.494814 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" event={"ID":"55b6dd11-c219-4b24-90eb-dbc096a67835","Type":"ContainerStarted","Data":"75436a3df44b0e6d0f107c0ad8e4ab6be3973a98a11e68d34310dc5eea08705c"} Mar 12 18:06:24 crc 
kubenswrapper[4926]: I0312 18:06:24.494828 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" event={"ID":"49e5e304-df7c-434b-8b17-f520e9bb7d52","Type":"ContainerStarted","Data":"76a0ae2a34847f15c5c8c0585f4a7daf639c1bba56be8e5de82c8ced6f6edcac"} Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.500337 4926 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.520530 4926 request.go:700] Waited for 1.929203658s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.523345 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.559161 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrhq\" (UniqueName: \"kubernetes.io/projected/1dc82997-2782-4e9e-a293-956fcb96acde-kube-api-access-xfrhq\") pod \"dns-operator-744455d44c-9zg78\" (UID: \"1dc82997-2782-4e9e-a293-956fcb96acde\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.596292 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjftj\" (UniqueName: \"kubernetes.io/projected/7ee39302-0316-4481-871e-538ffd31a507-kube-api-access-tjftj\") pod \"console-operator-58897d9998-s9tvm\" (UID: \"7ee39302-0316-4481-871e-538ffd31a507\") " pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.604268 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct49b\" (UniqueName: \"kubernetes.io/projected/e379fe1d-7780-4f17-8df8-f74f3dddbc23-kube-api-access-ct49b\") pod \"router-default-5444994796-wndvq\" (UID: \"e379fe1d-7780-4f17-8df8-f74f3dddbc23\") " pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.615299 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cae3622-73dc-43ee-9e5c-eb6c67e37c1e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6hxxs\" (UID: \"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.620467 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.626720 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2"] Mar 12 18:06:24 crc kubenswrapper[4926]: W0312 18:06:24.637021 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c30a7ad_f92b_445a_9201_fe55f247cf41.slice/crio-b55657cb73acd5feec151622d460258d2ed79fb0c98212b3ade5d94e9bbe6b62 WatchSource:0}: Error finding container b55657cb73acd5feec151622d460258d2ed79fb0c98212b3ade5d94e9bbe6b62: Status 404 returned error 
can't find the container with id b55657cb73acd5feec151622d460258d2ed79fb0c98212b3ade5d94e9bbe6b62 Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.640959 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.660235 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.682877 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.698861 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.699586 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.701595 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.706824 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z95pp"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.707868 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-qs58t"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.713240 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wsbc"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.716200 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.720598 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.731917 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.740568 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.776318 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.828579 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.856515 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.856743 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.858862 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.882926 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.882955 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wq55\" (UniqueName: \"kubernetes.io/projected/f6ad9439-d12c-4987-840a-002975ba1498-kube-api-access-4wq55\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.882972 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883007 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883022 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883046 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-srv-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883061 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4867106-501e-4408-af0b-7790cfc45a24-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883098 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883119 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4wwg\" (UniqueName: \"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-kube-api-access-d4wwg\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883135 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5212edbb-b23e-44b8-aab1-77dece6d1415-serving-cert\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883170 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5212edbb-b23e-44b8-aab1-77dece6d1415-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883186 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg4x\" (UniqueName: \"kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883209 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883223 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883248 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvfj\" (UniqueName: \"kubernetes.io/projected/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-kube-api-access-8zvfj\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883271 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a0d781-e388-41ae-878e-05a69d81c83e-config\") pod 
\"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883292 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883317 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883333 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883367 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883388 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883403 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb73641e-b156-470d-be1a-52c11f2efdf6-trusted-ca\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883418 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931c621-236d-4a2f-9c96-2c29483f19db-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883458 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883474 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883488 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883537 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a931c621-236d-4a2f-9c96-2c29483f19db-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883554 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2x5w\" (UniqueName: \"kubernetes.io/projected/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-kube-api-access-m2x5w\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883592 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdq7x\" (UniqueName: \"kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883606 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a0d781-e388-41ae-878e-05a69d81c83e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883638 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883653 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a0d781-e388-41ae-878e-05a69d81c83e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883671 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a931c621-236d-4a2f-9c96-2c29483f19db-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883688 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gdc\" (UniqueName: \"kubernetes.io/projected/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-kube-api-access-x4gdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883706 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d073c88-4608-4594-9feb-f1093455368d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883721 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-srv-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883735 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htq94\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-kube-api-access-htq94\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883750 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4867106-501e-4408-af0b-7790cfc45a24-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883785 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjs6\" (UniqueName: \"kubernetes.io/projected/439a0925-eff6-4da1-b65c-de0623b9daaa-kube-api-access-wpjs6\") 
pod \"migrator-59844c95c7-sbn5h\" (UID: \"439a0925-eff6-4da1-b65c-de0623b9daaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrh7w\" (UniqueName: \"kubernetes.io/projected/5d073c88-4608-4594-9feb-f1093455368d-kube-api-access-nrh7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883814 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883828 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltlt7\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883861 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883876 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883892 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqbj\" (UniqueName: \"kubernetes.io/projected/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-kube-api-access-5bqbj\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883909 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883931 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883973 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-proxy-tls\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.883989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blwnd\" (UniqueName: \"kubernetes.io/projected/06394469-defd-4710-90a2-b6c395c00d4f-kube-api-access-blwnd\") pod \"downloads-7954f5f757-st67x\" (UID: \"06394469-defd-4710-90a2-b6c395c00d4f\") " pod="openshift-console/downloads-7954f5f757-st67x" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.884004 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-images\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.884018 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb73641e-b156-470d-be1a-52c11f2efdf6-metrics-tls\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.884031 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92cqr\" (UniqueName: \"kubernetes.io/projected/5212edbb-b23e-44b8-aab1-77dece6d1415-kube-api-access-92cqr\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: E0312 18:06:24.886785 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.386769638 +0000 UTC m=+225.755395971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:24 crc kubenswrapper[4926]: W0312 18:06:24.915842 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4230f869_9456_44a1_87b3_342fc8c18ed7.slice/crio-01692fb622a525e9c42adef3bc2dd9239f99fa47f15b51038978d9cd4978f5eb WatchSource:0}: Error finding container 01692fb622a525e9c42adef3bc2dd9239f99fa47f15b51038978d9cd4978f5eb: Status 404 returned error can't find the container with id 01692fb622a525e9c42adef3bc2dd9239f99fa47f15b51038978d9cd4978f5eb Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.944204 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-s9tvm"] Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.984526 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985107 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-srv-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985129 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4867106-501e-4408-af0b-7790cfc45a24-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985153 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-webhook-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985177 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/416787ca-b62c-4fd5-84c5-57be59317faa-kube-api-access-rxlnr\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985214 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985233 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-config-volume\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985250 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4wwg\" (UniqueName: \"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-kube-api-access-d4wwg\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985268 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5212edbb-b23e-44b8-aab1-77dece6d1415-serving-cert\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985285 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzfp\" (UniqueName: \"kubernetes.io/projected/721fd54d-55f5-4477-a7b8-283a700c8c47-kube-api-access-8nzfp\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985302 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985319 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg4x\" (UniqueName: \"kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985335 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-plugins-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985358 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5212edbb-b23e-44b8-aab1-77dece6d1415-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: 
\"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985374 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-cabundle\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985391 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985406 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985430 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvfj\" (UniqueName: \"kubernetes.io/projected/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-kube-api-access-8zvfj\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985462 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a0d781-e388-41ae-878e-05a69d81c83e-config\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985491 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9lz\" (UniqueName: \"kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985512 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f14f5e4-148e-4282-9243-89c96906048a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985544 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6gw4\" (UniqueName: \"kubernetes.io/projected/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-kube-api-access-b6gw4\") pod 
\"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985559 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtltr\" (UniqueName: \"kubernetes.io/projected/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-kube-api-access-rtltr\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985583 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-socket-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985597 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c574438a-86a5-4a9f-aff3-47bf920cbabd-cert\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985615 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985637 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985661 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d8f4\" (UniqueName: \"kubernetes.io/projected/c574438a-86a5-4a9f-aff3-47bf920cbabd-kube-api-access-2d8f4\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985678 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985705 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-key\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985720 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985735 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb73641e-b156-470d-be1a-52c11f2efdf6-trusted-ca\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985758 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931c621-236d-4a2f-9c96-2c29483f19db-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985784 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985799 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-registration-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985814 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985829 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985843 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985893 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a931c621-236d-4a2f-9c96-2c29483f19db-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985908 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-apiservice-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985941 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2x5w\" (UniqueName: \"kubernetes.io/projected/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-kube-api-access-m2x5w\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985957 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbdk\" (UniqueName: \"kubernetes.io/projected/8acca805-2bf8-4dcb-a036-c4732084210e-kube-api-access-4xbdk\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.985989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-mountpoint-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986003 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00918e77-d6a1-4d1b-9986-37f3e648b322-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986020 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdq7x\" (UniqueName: \"kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986035 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a0d781-e388-41ae-878e-05a69d81c83e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986053 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986068 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wx8l\" (UniqueName: \"kubernetes.io/projected/00918e77-d6a1-4d1b-9986-37f3e648b322-kube-api-access-2wx8l\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986083 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a0d781-e388-41ae-878e-05a69d81c83e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986098 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-metrics-tls\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986114 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a931c621-236d-4a2f-9c96-2c29483f19db-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986146 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gdc\" (UniqueName: \"kubernetes.io/projected/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-kube-api-access-x4gdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986172 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d073c88-4608-4594-9feb-f1093455368d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986189 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrq7x\" (UniqueName: \"kubernetes.io/projected/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-kube-api-access-nrq7x\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986207 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-srv-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986222 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htq94\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-kube-api-access-htq94\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986245 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4867106-501e-4408-af0b-7790cfc45a24-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986261 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00918e77-d6a1-4d1b-9986-37f3e648b322-proxy-tls\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986275 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pshhr\" (UniqueName: \"kubernetes.io/projected/0f14f5e4-148e-4282-9243-89c96906048a-kube-api-access-pshhr\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986293 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjs6\" (UniqueName: \"kubernetes.io/projected/439a0925-eff6-4da1-b65c-de0623b9daaa-kube-api-access-wpjs6\") pod \"migrator-59844c95c7-sbn5h\" (UID: \"439a0925-eff6-4da1-b65c-de0623b9daaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986310 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/416787ca-b62c-4fd5-84c5-57be59317faa-serving-cert\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986326 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416787ca-b62c-4fd5-84c5-57be59317faa-config\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986352 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrh7w\" (UniqueName: 
\"kubernetes.io/projected/5d073c88-4608-4594-9feb-f1093455368d-kube-api-access-nrh7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986368 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986401 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltlt7\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986419 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-node-bootstrap-token\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986830 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986851 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986868 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986883 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqbj\" (UniqueName: \"kubernetes.io/projected/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-kube-api-access-5bqbj\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986899 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-tmpfs\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986915 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986930 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl7pp\" (UniqueName: \"kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp\") pod \"auto-csr-approver-29555646-wqpkb\" (UID: \"d68160cf-4e6c-4294-bfdc-4acb74637ecb\") " pod="openshift-infra/auto-csr-approver-29555646-wqpkb" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986971 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-proxy-tls\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.986987 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blwnd\" (UniqueName: \"kubernetes.io/projected/06394469-defd-4710-90a2-b6c395c00d4f-kube-api-access-blwnd\") pod \"downloads-7954f5f757-st67x\" (UID: \"06394469-defd-4710-90a2-b6c395c00d4f\") " pod="openshift-console/downloads-7954f5f757-st67x" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987004 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-images\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987039 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb73641e-b156-470d-be1a-52c11f2efdf6-metrics-tls\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987055 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92cqr\" (UniqueName: \"kubernetes.io/projected/5212edbb-b23e-44b8-aab1-77dece6d1415-kube-api-access-92cqr\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987083 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987097 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-certs\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987141 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wq55\" (UniqueName: \"kubernetes.io/projected/f6ad9439-d12c-4987-840a-002975ba1498-kube-api-access-4wq55\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987157 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-csi-data-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987183 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987235 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:24 crc kubenswrapper[4926]: I0312 18:06:24.987263 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:24.996497 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.496463123 +0000 UTC m=+225.865089466 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:24.998184 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.000147 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-srv-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.003690 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.004651 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.009028 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.011492 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.011803 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4867106-501e-4408-af0b-7790cfc45a24-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.012619 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a931c621-236d-4a2f-9c96-2c29483f19db-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.013495 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.013525 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a0d781-e388-41ae-878e-05a69d81c83e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.013891 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.014166 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.014424 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a0d781-e388-41ae-878e-05a69d81c83e-config\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.014571 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931c621-236d-4a2f-9c96-2c29483f19db-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.015287 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.015419 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/5212edbb-b23e-44b8-aab1-77dece6d1415-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.015592 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5212edbb-b23e-44b8-aab1-77dece6d1415-serving-cert\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.015749 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb73641e-b156-470d-be1a-52c11f2efdf6-trusted-ca\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.016181 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.016717 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.019209 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.019418 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-images\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.021533 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d073c88-4608-4594-9feb-f1093455368d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.021996 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: 
\"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.024385 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zg78"] Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.029677 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4867106-501e-4408-af0b-7790cfc45a24-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.029710 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-srv-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.029849 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.030156 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-proxy-tls\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.031054 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6ad9439-d12c-4987-840a-002975ba1498-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.032828 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.038191 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.038645 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb73641e-b156-470d-be1a-52c11f2efdf6-metrics-tls\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: 
\"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.039381 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92cqr\" (UniqueName: \"kubernetes.io/projected/5212edbb-b23e-44b8-aab1-77dece6d1415-kube-api-access-92cqr\") pod \"openshift-config-operator-7777fb866f-h95sl\" (UID: \"5212edbb-b23e-44b8-aab1-77dece6d1415\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.058310 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdq7x\" (UniqueName: \"kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x\") pod \"marketplace-operator-79b997595-c68kr\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.079533 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a0d781-e388-41ae-878e-05a69d81c83e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-sj8r7\" (UID: \"36a0d781-e388-41ae-878e-05a69d81c83e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.082987 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089585 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-webhook-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089620 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/416787ca-b62c-4fd5-84c5-57be59317faa-kube-api-access-rxlnr\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089646 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-config-volume\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089670 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzfp\" (UniqueName: \"kubernetes.io/projected/721fd54d-55f5-4477-a7b8-283a700c8c47-kube-api-access-8nzfp\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089686 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume\") pod 
\"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089708 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-plugins-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089726 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-cabundle\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089755 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9lz\" (UniqueName: \"kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089774 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089789 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f14f5e4-148e-4282-9243-89c96906048a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089806 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6gw4\" (UniqueName: \"kubernetes.io/projected/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-kube-api-access-b6gw4\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089821 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtltr\" (UniqueName: \"kubernetes.io/projected/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-kube-api-access-rtltr\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089836 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-socket-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: 
I0312 18:06:25.089854 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c574438a-86a5-4a9f-aff3-47bf920cbabd-cert\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089871 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d8f4\" (UniqueName: \"kubernetes.io/projected/c574438a-86a5-4a9f-aff3-47bf920cbabd-kube-api-access-2d8f4\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089887 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-key\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089903 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-registration-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089917 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089951 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-apiservice-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089974 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbdk\" (UniqueName: \"kubernetes.io/projected/8acca805-2bf8-4dcb-a036-c4732084210e-kube-api-access-4xbdk\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.089994 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-mountpoint-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090014 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00918e77-d6a1-4d1b-9986-37f3e648b322-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090047 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wx8l\" (UniqueName: \"kubernetes.io/projected/00918e77-d6a1-4d1b-9986-37f3e648b322-kube-api-access-2wx8l\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090072 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-metrics-tls\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090100 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrq7x\" (UniqueName: \"kubernetes.io/projected/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-kube-api-access-nrq7x\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090120 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00918e77-d6a1-4d1b-9986-37f3e648b322-proxy-tls\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090134 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pshhr\" (UniqueName: \"kubernetes.io/projected/0f14f5e4-148e-4282-9243-89c96906048a-kube-api-access-pshhr\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090160 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/416787ca-b62c-4fd5-84c5-57be59317faa-serving-cert\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090179 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416787ca-b62c-4fd5-84c5-57be59317faa-config\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090202 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-node-bootstrap-token\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090233 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-tmpfs\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090248 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl7pp\" (UniqueName: \"kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp\") pod \"auto-csr-approver-29555646-wqpkb\" (UID: \"d68160cf-4e6c-4294-bfdc-4acb74637ecb\") " pod="openshift-infra/auto-csr-approver-29555646-wqpkb" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090281 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-certs\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090298 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-csi-data-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.090409 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-csi-data-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.091135 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-config-volume\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.091185 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.093969 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.593957707 +0000 UTC m=+225.962584040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.094392 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-plugins-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.094910 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-mountpoint-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.095191 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-cabundle\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.095284 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-registration-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.095320 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8acca805-2bf8-4dcb-a036-c4732084210e-socket-dir\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.096009 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00918e77-d6a1-4d1b-9986-37f3e648b322-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.097075 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs"]
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.097085 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/416787ca-b62c-4fd5-84c5-57be59317faa-config\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.097353 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-tmpfs\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.097388 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c574438a-86a5-4a9f-aff3-47bf920cbabd-cert\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.099302 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/721fd54d-55f5-4477-a7b8-283a700c8c47-signing-key\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.099966 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-webhook-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.100690 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00918e77-d6a1-4d1b-9986-37f3e648b322-proxy-tls\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.101321 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.104239 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-apiservice-cert\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.104424 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f14f5e4-148e-4282-9243-89c96906048a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.105997 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.106268 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-node-bootstrap-token\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.110904 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/416787ca-b62c-4fd5-84c5-57be59317faa-serving-cert\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.111013 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-certs\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.114513 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-metrics-tls\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.124035 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.143217 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a931c621-236d-4a2f-9c96-2c29483f19db-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b9s9h\" (UID: \"a931c621-236d-4a2f-9c96-2c29483f19db\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.160326 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gdc\" (UniqueName: \"kubernetes.io/projected/f54babbf-5b93-4ef7-99d7-0cfe05e921c5-kube-api-access-x4gdc\") pod \"kube-storage-version-migrator-operator-b67b599dd-snpzd\" (UID: \"f54babbf-5b93-4ef7-99d7-0cfe05e921c5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.186758 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.189958 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.200682 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.201456 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.70142186 +0000 UTC m=+226.070048193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.206330 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.217348 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wq55\" (UniqueName: \"kubernetes.io/projected/f6ad9439-d12c-4987-840a-002975ba1498-kube-api-access-4wq55\") pod \"olm-operator-6b444d44fb-x4kkc\" (UID: \"f6ad9439-d12c-4987-840a-002975ba1498\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.230780 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htq94\" (UniqueName: \"kubernetes.io/projected/f4867106-501e-4408-af0b-7790cfc45a24-kube-api-access-htq94\") pod \"cluster-image-registry-operator-dc59b4c8b-xxzt9\" (UID: \"f4867106-501e-4408-af0b-7790cfc45a24\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.236746 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2x5w\" (UniqueName: \"kubernetes.io/projected/b99fa2a7-b47e-433c-b6d5-7c1306a79cde-kube-api-access-m2x5w\") pod \"catalog-operator-68c6474976-rdjk9\" (UID: \"b99fa2a7-b47e-433c-b6d5-7c1306a79cde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.257096 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.275814 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4wwg\" (UniqueName: \"kubernetes.io/projected/fb73641e-b156-470d-be1a-52c11f2efdf6-kube-api-access-d4wwg\") pod \"ingress-operator-5b745b69d9-h5mf8\" (UID: \"fb73641e-b156-470d-be1a-52c11f2efdf6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.301021 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvfj\" (UniqueName: \"kubernetes.io/projected/f68a2b5d-036a-43ee-a9f2-dd94d9f51d51-kube-api-access-8zvfj\") pod \"multus-admission-controller-857f4d67dd-bsvxw\" (UID: \"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.302028 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h95sl"]
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.302970 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.303331 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.803319536 +0000 UTC m=+226.171945869 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.318933 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjs6\" (UniqueName: \"kubernetes.io/projected/439a0925-eff6-4da1-b65c-de0623b9daaa-kube-api-access-wpjs6\") pod \"migrator-59844c95c7-sbn5h\" (UID: \"439a0925-eff6-4da1-b65c-de0623b9daaa\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.348877 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg4x\" (UniqueName: \"kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x\") pod \"controller-manager-879f6c89f-q49l5\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") " pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.366247 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqbj\" (UniqueName: \"kubernetes.io/projected/2e2b819b-3df8-4d51-b6d0-b30e33f1ceba-kube-api-access-5bqbj\") pod \"machine-config-operator-74547568cd-bmp6d\" (UID: \"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.388822 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrh7w\" (UniqueName: \"kubernetes.io/projected/5d073c88-4608-4594-9feb-f1093455368d-kube-api-access-nrh7w\") pod \"control-plane-machine-set-operator-78cbb6b69f-sgl9z\" (UID: \"5d073c88-4608-4594-9feb-f1093455368d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.399066 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.402291 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltlt7\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.406325 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.406792 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:25.906774808 +0000 UTC m=+226.275401141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.407298 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.412520 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.422000 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.423895 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blwnd\" (UniqueName: \"kubernetes.io/projected/06394469-defd-4710-90a2-b6c395c00d4f-kube-api-access-blwnd\") pod \"downloads-7954f5f757-st67x\" (UID: \"06394469-defd-4710-90a2-b6c395c00d4f\") " pod="openshift-console/downloads-7954f5f757-st67x"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.428517 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.436202 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.436841 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzfp\" (UniqueName: \"kubernetes.io/projected/721fd54d-55f5-4477-a7b8-283a700c8c47-kube-api-access-8nzfp\") pod \"service-ca-9c57cc56f-rx7tc\" (UID: \"721fd54d-55f5-4477-a7b8-283a700c8c47\") " pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.444741 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.455748 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wx8l\" (UniqueName: \"kubernetes.io/projected/00918e77-d6a1-4d1b-9986-37f3e648b322-kube-api-access-2wx8l\") pod \"machine-config-controller-84d6567774-dnk9m\" (UID: \"00918e77-d6a1-4d1b-9986-37f3e648b322\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.465889 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.478747 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxlnr\" (UniqueName: \"kubernetes.io/projected/416787ca-b62c-4fd5-84c5-57be59317faa-kube-api-access-rxlnr\") pod \"service-ca-operator-777779d784-z4g7g\" (UID: \"416787ca-b62c-4fd5-84c5-57be59317faa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.487892 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.499414 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.507390 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d8f4\" (UniqueName: \"kubernetes.io/projected/c574438a-86a5-4a9f-aff3-47bf920cbabd-kube-api-access-2d8f4\") pod \"ingress-canary-pd4lm\" (UID: \"c574438a-86a5-4a9f-aff3-47bf920cbabd\") " pod="openshift-ingress-canary/ingress-canary-pd4lm"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.516985 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbdk\" (UniqueName: \"kubernetes.io/projected/8acca805-2bf8-4dcb-a036-c4732084210e-kube-api-access-4xbdk\") pod \"csi-hostpathplugin-c9xl5\" (UID: \"8acca805-2bf8-4dcb-a036-c4732084210e\") " pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.523868 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.525556 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.525886 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.025870754 +0000 UTC m=+226.394497087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.533689 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.544059 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" event={"ID":"1dc82997-2782-4e9e-a293-956fcb96acde","Type":"ContainerStarted","Data":"45644e365c4c4c529f9a87c6adeb38c71d01569ba997e441f62dc7542cad62f8"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.544105 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" event={"ID":"1dc82997-2782-4e9e-a293-956fcb96acde","Type":"ContainerStarted","Data":"a481d30dbd54dc572c52e184a6bde6b513a6eb7b1a50a62d5566922da33a686d"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.546786 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtltr\" (UniqueName: \"kubernetes.io/projected/4c9eaabb-c0c2-4f71-82eb-7494d0a37075-kube-api-access-rtltr\") pod \"dns-default-rxml7\" (UID: \"4c9eaabb-c0c2-4f71-82eb-7494d0a37075\") " pod="openshift-dns/dns-default-rxml7"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.561571 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" event={"ID":"5db62dca-ba86-4ca4-861e-003d09e5ac0f","Type":"ContainerStarted","Data":"d6d9c79e93d5134e8d839538f09145ae2bd4d77bbe3f38c3f1f15cf37a650938"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.561616 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" event={"ID":"5db62dca-ba86-4ca4-861e-003d09e5ac0f","Type":"ContainerStarted","Data":"0182f53fcf7be43817f7e2563d3881425c615ed6aa5302372ea5d8c19324879d"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.561629 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" event={"ID":"5db62dca-ba86-4ca4-861e-003d09e5ac0f","Type":"ContainerStarted","Data":"ad09cbb8919dc168feec360f061267f9c6dab1c238365c26b99a91802e4a61ea"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.578651 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6gw4\" (UniqueName: \"kubernetes.io/projected/6aa7a74a-b3d4-47df-a1b1-c83a5639e592-kube-api-access-b6gw4\") pod \"packageserver-d55dfcdfc-nhq52\" (UID: \"6aa7a74a-b3d4-47df-a1b1-c83a5639e592\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.581048 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9lz\" (UniqueName: \"kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz\") pod \"collect-profiles-29555640-5965f\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.583945 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"]
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.584740 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.591797 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rxml7"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.596841 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pshhr\" (UniqueName: \"kubernetes.io/projected/0f14f5e4-148e-4282-9243-89c96906048a-kube-api-access-pshhr\") pod \"package-server-manager-789f6589d5-jfb58\" (UID: \"0f14f5e4-148e-4282-9243-89c96906048a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.603570 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.606527 4926 generic.go:334] "Generic (PLEG): container finished" podID="48a0fa25-6b2d-4668-b8e5-824912077f19" containerID="eca16552cabc66a957cf8d61cb7d1632766bc219c67a7d16dae3bf783cc793a0" exitCode=0
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.606593 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" event={"ID":"48a0fa25-6b2d-4668-b8e5-824912077f19","Type":"ContainerDied","Data":"eca16552cabc66a957cf8d61cb7d1632766bc219c67a7d16dae3bf783cc793a0"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.606618 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" event={"ID":"48a0fa25-6b2d-4668-b8e5-824912077f19","Type":"ContainerStarted","Data":"7a5c5c5c226a6e688b1683c800ce32947a83286c8aff57cfd8b310a771f82f11"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.614768 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" event={"ID":"7ee39302-0316-4481-871e-538ffd31a507","Type":"ContainerStarted","Data":"e024d0e53c7fa71e7e203c8c417d25cff769541c875faeb7d565736258629fed"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.614811 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" event={"ID":"7ee39302-0316-4481-871e-538ffd31a507","Type":"ContainerStarted","Data":"83e3807119b1915ea1314bd29b524a6cebc9ce445f044b1fd3296040625c0c21"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.615673 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-s9tvm"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.618634 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" event={"ID":"eedd886d-5443-47e1-afbf-5aff90067f3b","Type":"ContainerStarted","Data":"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.618676 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" event={"ID":"eedd886d-5443-47e1-afbf-5aff90067f3b","Type":"ContainerStarted","Data":"f375d41ba80f860b9235fe8ebb0d083845e27083ae0a2e004437a84b6b1f824e"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.619582 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.625327 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrq7x\" (UniqueName: \"kubernetes.io/projected/a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b-kube-api-access-nrq7x\") pod \"machine-config-server-drmq2\" (UID: \"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b\") " pod="openshift-machine-config-operator/machine-config-server-drmq2"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.628880 4926 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7rmsn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body=
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.628933 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.629224 4926 patch_prober.go:28] interesting pod/console-operator-58897d9998-s9tvm container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.629244 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" podUID="7ee39302-0316-4481-871e-538ffd31a507" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.629465 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.629768 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.129748876 +0000 UTC m=+226.498375209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.640169 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pd4lm"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.647903 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wndvq" event={"ID":"e379fe1d-7780-4f17-8df8-f74f3dddbc23","Type":"ContainerStarted","Data":"251116173e0184dabcae9f1470dfeba148eb84e6d6fd0d2f14cd2177ac791148"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.647940 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wndvq" event={"ID":"e379fe1d-7780-4f17-8df8-f74f3dddbc23","Type":"ContainerStarted","Data":"8ecdaf5b859f4687483dedd414756336f848c04f6ad974e42c6606b8128fe624"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.659561 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7"]
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.669321 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl7pp\" (UniqueName: \"kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp\") pod \"auto-csr-approver-29555646-wqpkb\" (UID: \"d68160cf-4e6c-4294-bfdc-4acb74637ecb\") " pod="openshift-infra/auto-csr-approver-29555646-wqpkb"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.670278 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-st67x"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.710934 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd"]
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.723704 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" event={"ID":"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e","Type":"ContainerStarted","Data":"25d4891e318db63842a21893f47245b10054234a14851c794f66da64eb9bf8d4"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.723796 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" event={"ID":"5cae3622-73dc-43ee-9e5c-eb6c67e37c1e","Type":"ContainerStarted","Data":"71e522f800a2f7ad15f4527ee0006ccf2a2a361eee826af72fe7e9f03810ff65"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.729641 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" event={"ID":"4230f869-9456-44a1-87b3-342fc8c18ed7","Type":"ContainerStarted","Data":"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.729675 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" event={"ID":"4230f869-9456-44a1-87b3-342fc8c18ed7","Type":"ContainerStarted","Data":"01692fb622a525e9c42adef3bc2dd9239f99fa47f15b51038978d9cd4978f5eb"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.730286 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.730707 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.731512 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.23150033 +0000 UTC m=+226.600126663 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.738244 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" event={"ID":"5212edbb-b23e-44b8-aab1-77dece6d1415","Type":"ContainerStarted","Data":"028109e363337ff53ceee7c1b43dedf9da3bb3f4229d2026205509d87ba233d4"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.754683 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" event={"ID":"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a","Type":"ContainerStarted","Data":"0db594dcd1a582cafdacd02c1df7aaeaa57401d84254c36a469d20dcf1ceacb5"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.754744 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" event={"ID":"64a2a31c-e6c6-47da-8c6c-2f1ecc2cae0a","Type":"ContainerStarted","Data":"728c89fc3a0575b8e261e5e64c1d58ab08fc1123b8b8e394acde8c8ea8a93621"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.756816 4926 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-svrln container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.757146 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.758659 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" event={"ID":"7e94e6e0-b16d-462f-b791-ba20acdcb809","Type":"ContainerStarted","Data":"67c4943e95a1c40606c31e36e0635b0bf125bbcde238649f4a46a977851332cc"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.768056 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" event={"ID":"8c30a7ad-f92b-445a-9201-fe55f247cf41","Type":"ContainerStarted","Data":"1a16a2f2c259d340eac988b4f95f87105acb91d2ac8dd2880085905b12018f2b"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.768092 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" event={"ID":"8c30a7ad-f92b-445a-9201-fe55f247cf41","Type":"ContainerStarted","Data":"b55657cb73acd5feec151622d460258d2ed79fb0c98212b3ade5d94e9bbe6b62"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.777331 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" event={"ID":"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca","Type":"ContainerStarted","Data":"7e99758c6494f5e0e5da39dab8d3b4fd2c2d8f5b3e5308d24dc5c55220bef34b"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.777374 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" event={"ID":"a5e9834d-5aeb-4154-ae70-c2b6b07c9eca","Type":"ContainerStarted","Data":"2bd1b4dbef2d5299c5d28d919ce85c08f87b2399a31de3d7c1b45884c90c69cc"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.783663 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" event={"ID":"49e5e304-df7c-434b-8b17-f520e9bb7d52","Type":"ContainerStarted","Data":"6d5e48f8f77cc22923aa8c316c194b01cdb7b43f3f11516201fa35dbdcef9239"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.783729 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" event={"ID":"49e5e304-df7c-434b-8b17-f520e9bb7d52","Type":"ContainerStarted","Data":"3d386874795ceeea475021dc8c00ce587b948725302ec554b6280b4d4f5600da"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.788022 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" event={"ID":"55b6dd11-c219-4b24-90eb-dbc096a67835","Type":"ContainerStarted","Data":"7562a0b82dc71e945dc3015814e8b06ec357f6e7d2119f7931077ec6cb598845"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.788113 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" event={"ID":"55b6dd11-c219-4b24-90eb-dbc096a67835","Type":"ContainerStarted","Data":"13e98918b67aa3a4604fb14453bb42264b92bf6862b1423d05244e57bc41de33"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.813805 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" event={"ID":"4c2f00a8-c3ce-4957-9093-d8c2cce49992","Type":"ContainerStarted","Data":"91cd7543854acab335d02d2b9cdb6610920076420f11b25cf3564733015dfdcd"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.813866 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" event={"ID":"4c2f00a8-c3ce-4957-9093-d8c2cce49992","Type":"ContainerStarted","Data":"46ec498673358f957eb27199406a9e3524f91fc376937de73c5dd4885e364b77"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.823215 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.842068 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.844799 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.344784224 +0000 UTC m=+226.713410547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.864104 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wndvq"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.866777 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.866852 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.867024 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555646-wqpkb"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.876044 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.879689 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.882929 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vb9qx" event={"ID":"270031fa-3d83-4edf-bb5d-19ce9e1a693d","Type":"ContainerStarted","Data":"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"}
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.913390 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-drmq2"
Mar 12 18:06:25 crc kubenswrapper[4926]: I0312 18:06:25.945022 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:25 crc kubenswrapper[4926]: E0312 18:06:25.951008 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.450983951 +0000 UTC m=+226.819610284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.046898 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.048568 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.548545377 +0000 UTC m=+226.917171700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.123996 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.129360 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.133256 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.149593 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.150474 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.650422374 +0000 UTC m=+227.019048707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.168033 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.169672 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.179679 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.192468 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bsvxw"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.255628 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.256060 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.756042585 +0000 UTC m=+227.124668918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.354011 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38004: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.358969 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.359710 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.859695951 +0000 UTC m=+227.228322284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.438174 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-q2smg" podStartSLOduration=155.438147695 podStartE2EDuration="2m35.438147695s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.436893471 +0000 UTC m=+226.805519804" watchObservedRunningTime="2026-03-12 18:06:26.438147695 +0000 UTC m=+226.806774028"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.438291 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38016: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.476233 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.476924 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:26.976899904 +0000 UTC m=+227.345526237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.509822 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hwwdz" podStartSLOduration=155.509802961 podStartE2EDuration="2m35.509802961s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.473232342 +0000 UTC m=+226.841858675" watchObservedRunningTime="2026-03-12 18:06:26.509802961 +0000 UTC m=+226.878429294"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.534981 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38648: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.589377 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.589753 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.089739416 +0000 UTC m=+227.458365749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.609038 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x7fm2" podStartSLOduration=155.609018943 podStartE2EDuration="2m35.609018943s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.607898991 +0000 UTC m=+226.976525324" watchObservedRunningTime="2026-03-12 18:06:26.609018943 +0000 UTC m=+226.977645266"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.643021 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38664: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.692062 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.719154 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.219123089 +0000 UTC m=+227.587749422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.755761 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38668: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.818767 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.818853 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.819248 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.819103 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.319091482 +0000 UTC m=+227.687717815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.865274 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38678: no serving certificate available for the kubelet"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.868665 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wndvq" podStartSLOduration=155.868642801 podStartE2EDuration="2m35.868642801s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.810611456 +0000 UTC m=+227.179237809" watchObservedRunningTime="2026-03-12 18:06:26.868642801 +0000 UTC m=+227.237269134"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.876852 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 18:06:26 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld
Mar 12 18:06:26 crc kubenswrapper[4926]: [+]process-running ok
Mar 12 18:06:26 crc kubenswrapper[4926]: healthz check failed
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.877035 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.897133 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" podStartSLOduration=155.897118545 podStartE2EDuration="2m35.897118545s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.894923734 +0000 UTC m=+227.263550067" watchObservedRunningTime="2026-03-12 18:06:26.897118545 +0000 UTC m=+227.265744878"
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.913087 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.922835 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.922914 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.422895883 +0000 UTC m=+227.791522216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.923572 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:26 crc kubenswrapper[4926]: E0312 18:06:26.924107 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.424085096 +0000 UTC m=+227.792711429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.941364 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9"]
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.949853 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerStarted","Data":"7a6f3ba59fc7e8583a370a6e58f7ad7026a2677d263e98044721d21503f15589"}
Mar 12 18:06:26 crc kubenswrapper[4926]: I0312 18:06:26.974662 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qbxxz" podStartSLOduration=155.974636993 podStartE2EDuration="2m35.974636993s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:26.960205052 +0000 UTC m=+227.328831385" watchObservedRunningTime="2026-03-12 18:06:26.974636993 +0000 UTC m=+227.343263336"
Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.006704 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-drmq2" event={"ID":"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b","Type":"ContainerStarted","Data":"97bc5d944c909a6163942dd9455d7d8952973ac21f1d08511a5d86cdab39077e"}
Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.016600 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38690: no serving certificate available for the kubelet"
Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.021293 4926
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" event={"ID":"f54babbf-5b93-4ef7-99d7-0cfe05e921c5","Type":"ContainerStarted","Data":"bf119e4b4a98067cffceb4ed59a629041510e11eadc38d9a74a3cd8f3bfbd2a6"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.024577 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.024895 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.524879323 +0000 UTC m=+227.893505656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.037100 4926 generic.go:334] "Generic (PLEG): container finished" podID="7e94e6e0-b16d-462f-b791-ba20acdcb809" containerID="111f37543ab21017395918e0ccd1df09471d3fe16ac7e1da8c73a6cc9fe06a30" exitCode=0 Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.037288 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" event={"ID":"7e94e6e0-b16d-462f-b791-ba20acdcb809","Type":"ContainerDied","Data":"111f37543ab21017395918e0ccd1df09471d3fe16ac7e1da8c73a6cc9fe06a30"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.089643 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" event={"ID":"b99fa2a7-b47e-433c-b6d5-7c1306a79cde","Type":"ContainerStarted","Data":"46dff74c54b464b95a14e1b9b5858f50f013957acfa111c498373bbabe094b26"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.106290 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bfn44" podStartSLOduration=156.106265728 podStartE2EDuration="2m36.106265728s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.090070447 +0000 UTC m=+227.458696780" watchObservedRunningTime="2026-03-12 18:06:27.106265728 +0000 UTC m=+227.474892061" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.126609 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc 
kubenswrapper[4926]: E0312 18:06:27.127209 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.627184051 +0000 UTC m=+227.995810384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.144604 4926 generic.go:334] "Generic (PLEG): container finished" podID="5212edbb-b23e-44b8-aab1-77dece6d1415" containerID="0714bb4dcff4dc2428df4aeda6780222f34109de8c6b09d77bc70541c06afb17" exitCode=0 Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.145238 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" event={"ID":"5212edbb-b23e-44b8-aab1-77dece6d1415","Type":"ContainerDied","Data":"0714bb4dcff4dc2428df4aeda6780222f34109de8c6b09d77bc70541c06afb17"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.185947 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-st67x"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.191335 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" event={"ID":"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51","Type":"ContainerStarted","Data":"b9f1222ee65a38125a7545024c47563dcf35849967132a1c5311a2a6a6a7e885"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.205308 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" event={"ID":"a931c621-236d-4a2f-9c96-2c29483f19db","Type":"ContainerStarted","Data":"9de569ecb6b66b1d5452a6d02bfa27e9f6d654e0af62852910db34e88fe9bf9a"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.208212 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pd4lm"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.225526 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-vb9qx" podStartSLOduration=156.225504758 podStartE2EDuration="2m36.225504758s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.213882475 +0000 UTC m=+227.582508818" watchObservedRunningTime="2026-03-12 18:06:27.225504758 +0000 UTC m=+227.594131091" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.225649 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rxml7"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.235024 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.235704 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.735656381 +0000 UTC m=+228.104282724 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.240679 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.252811 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" event={"ID":"439a0925-eff6-4da1-b65c-de0623b9daaa","Type":"ContainerStarted","Data":"70134ce23df9422d4c083805e0f21dd102ee5a93120340676eb68da71536b00c"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.289059 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.291475 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" event={"ID":"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98","Type":"ContainerStarted","Data":"0e10ddf44702564bdb0735ff145d6f01e01df0a99bcf4c2caaa2097751e18b0f"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.300679 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2wsbc" podStartSLOduration=156.300660051 podStartE2EDuration="2m36.300660051s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.291044613 +0000 UTC m=+227.659670946" watchObservedRunningTime="2026-03-12 18:06:27.300660051 +0000 UTC m=+227.669286394" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.315090 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" event={"ID":"1dc82997-2782-4e9e-a293-956fcb96acde","Type":"ContainerStarted","Data":"806e08739a54e95bdb8abf4febfb3f8c2188846a99469ff5f9e5ae8b483c7d80"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.332150 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" podStartSLOduration=156.332133337 podStartE2EDuration="2m36.332133337s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.328871616 +0000 UTC m=+227.697497949" watchObservedRunningTime="2026-03-12 18:06:27.332133337 +0000 UTC m=+227.700759670" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.337405 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.337517 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rx7tc"] Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.337722 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.837711723 +0000 UTC m=+228.206338056 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.339208 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.363784 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c9xl5"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.379216 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-qs58t" podStartSLOduration=156.379198198 podStartE2EDuration="2m36.379198198s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.378468287 +0000 UTC m=+227.747094620" watchObservedRunningTime="2026-03-12 18:06:27.379198198 +0000 UTC m=+227.747824531" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.384946 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38700: no serving certificate available for the kubelet" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.412836 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555646-wqpkb"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.443801 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.443988 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.943957171 +0000 UTC m=+228.312583494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.444258 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.445169 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:27.945158284 +0000 UTC m=+228.313784617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.456608 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" event={"ID":"fb73641e-b156-470d-be1a-52c11f2efdf6","Type":"ContainerStarted","Data":"ab450526f0818681745b07bb50413cd15f837f3f97782151907eedba15623950"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.460095 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.462426 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.463892 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" event={"ID":"36a0d781-e388-41ae-878e-05a69d81c83e","Type":"ContainerStarted","Data":"eb51896bc46a5a937f2c8797ccaef53c9c5e609a899b6bd2d5834f33c221ee00"} Mar 12 18:06:27 crc kubenswrapper[4926]: W0312 18:06:27.528594 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd68160cf_4e6c_4294_bfdc_4acb74637ecb.slice/crio-71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042 WatchSource:0}: Error finding container 71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042: Status 404 returned error can't find the container with id 71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042 Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.533014 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" podStartSLOduration=156.53299947 podStartE2EDuration="2m36.53299947s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.50640379 +0000 UTC m=+227.875030123" watchObservedRunningTime="2026-03-12 18:06:27.53299947 +0000 UTC m=+227.901625803" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.533131 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" event={"ID":"5d073c88-4608-4594-9feb-f1093455368d","Type":"ContainerStarted","Data":"c6f7764c266bdcce29b8bcda278c7f93167606bdf942f0af18206615b3309291"} Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.541175 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58"] Mar 12 18:06:27 crc kubenswrapper[4926]: W0312 18:06:27.540779 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8acca805_2bf8_4dcb_a036_c4732084210e.slice/crio-8b631f999700f2f51aa43e2ed2f5b2edb1d58820b4c614d63e8892b10d5e4413 WatchSource:0}: Error finding container 8b631f999700f2f51aa43e2ed2f5b2edb1d58820b4c614d63e8892b10d5e4413: Status 404 returned error can't find the container with id 8b631f999700f2f51aa43e2ed2f5b2edb1d58820b4c614d63e8892b10d5e4413 Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.545365 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.545953 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.04593483 +0000 UTC m=+228.414561163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.553961 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.553971 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.554410 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.602743 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.651058 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.659216 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6hxxs" podStartSLOduration=156.659192684 podStartE2EDuration="2m36.659192684s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.652290001 +0000 UTC m=+228.020916334" watchObservedRunningTime="2026-03-12 18:06:27.659192684 +0000 UTC m=+228.027819017" Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.660809 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.160793038 +0000 UTC m=+228.529419371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.707586 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.738717 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"] Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.752774 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.753130 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.253113469 +0000 UTC m=+228.621739802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.798271 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9zg78" podStartSLOduration=156.798214995 podStartE2EDuration="2m36.798214995s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:27.79264997 +0000 UTC m=+228.161276323" watchObservedRunningTime="2026-03-12 18:06:27.798214995 +0000 UTC m=+228.166841328" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.855374 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.856257 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.356242561 +0000 UTC m=+228.724868894 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.868749 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:27 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:27 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:27 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.868835 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.956350 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.956526 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.456490552 +0000 UTC m=+228.825116885 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:27 crc kubenswrapper[4926]: I0312 18:06:27.957224 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:27 crc kubenswrapper[4926]: E0312 18:06:27.957575 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.457563421 +0000 UTC m=+228.826189744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.058608 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.058968 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.558925984 +0000 UTC m=+228.927552317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.082323 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38704: no serving certificate available for the kubelet" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.163154 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.163788 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.663772284 +0000 UTC m=+229.032398607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.164338 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-s9tvm" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.264660 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.264832 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.764801777 +0000 UTC m=+229.133428130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.264929 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.265202 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.765191917 +0000 UTC m=+229.133818250 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.366345 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.366884 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.866868798 +0000 UTC m=+229.235495121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.467951 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.468221 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:28.96821031 +0000 UTC m=+229.336836643 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.566665 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-st67x" event={"ID":"06394469-defd-4710-90a2-b6c395c00d4f","Type":"ContainerStarted","Data":"0b346b108d62eb5da49953967efe8d2aa508da3a0686aee8e0074faddf6a0deb"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.566963 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-st67x" event={"ID":"06394469-defd-4710-90a2-b6c395c00d4f","Type":"ContainerStarted","Data":"c3c4ba69a1afa467197ea96769c8eddc625df688c9e556cde4a4df95630be0c7"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.567910 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-st67x" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.570776 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.571062 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.071048764 +0000 UTC m=+229.439675097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.571130 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.571154 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.582719 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" event={"ID":"48a0fa25-6b2d-4668-b8e5-824912077f19","Type":"ContainerStarted","Data":"902017b816fa06868e32faf2a15deecb13941d5338ff7a5ccdca3d1cef972aaa"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.584429 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" event={"ID":"6aa7a74a-b3d4-47df-a1b1-c83a5639e592","Type":"ContainerStarted","Data":"d02ba0167b9d3dcca2129478af6fb364197237d858ab6b10ee3c1903d568411b"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.585801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" event={"ID":"439a0925-eff6-4da1-b65c-de0623b9daaa","Type":"ContainerStarted","Data":"706bb3ae8d83a05d06d62a552dbf2b2212addee1d94ab04aa29717c7aa168687"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.585821 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" event={"ID":"439a0925-eff6-4da1-b65c-de0623b9daaa","Type":"ContainerStarted","Data":"02ddaa7c3b1f7c1e19e74cc27305fa6152d7ea38439bde76362b6a569f7807ab"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.587808 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" event={"ID":"f54babbf-5b93-4ef7-99d7-0cfe05e921c5","Type":"ContainerStarted","Data":"063c70ab5812212bb3956af2f3aadc7cbf5842847da96dcb5a138072de4412d4"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.591451 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-st67x" podStartSLOduration=157.591422731 podStartE2EDuration="2m37.591422731s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.588734626 +0000 UTC m=+228.957360969" watchObservedRunningTime="2026-03-12 18:06:28.591422731 +0000 UTC m=+228.960049064" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.616259 4926 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" event={"ID":"7e94e6e0-b16d-462f-b791-ba20acdcb809","Type":"ContainerStarted","Data":"7b6f93f7edb6492e4695d79094d85d1f0d7cae2a51625b9c7258b1a05824775b"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.619092 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-snpzd" podStartSLOduration=157.619069291 podStartE2EDuration="2m37.619069291s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.617182249 +0000 UTC m=+228.985808582" watchObservedRunningTime="2026-03-12 18:06:28.619069291 +0000 UTC m=+228.987695634" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.635143 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" event={"ID":"5212edbb-b23e-44b8-aab1-77dece6d1415","Type":"ContainerStarted","Data":"3683e6d87ebfe0ba92a5a25a982d7182088dee633daa02366e090901ac184a34"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.635695 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.637364 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" event={"ID":"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba","Type":"ContainerStarted","Data":"c894dbfb8f955a86829df2ba4efcf318943892ea413eff807fd583ce708825e5"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.637383 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" event={"ID":"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba","Type":"ContainerStarted","Data":"03009e64aa09225c59aea596418228aebd555b8d4b2b3d4aac4ce13dbcb9ad9f"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.637392 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" event={"ID":"2e2b819b-3df8-4d51-b6d0-b30e33f1ceba","Type":"ContainerStarted","Data":"17b32e16b34462b3c5ef720559a7456983ee51be0128c62b7047131940a33bc0"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.639240 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" event={"ID":"0f14f5e4-148e-4282-9243-89c96906048a","Type":"ContainerStarted","Data":"50d50f4e8d969e7d550102332232f39ac77c2b02640011b745a39d32eeece517"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.639261 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" event={"ID":"0f14f5e4-148e-4282-9243-89c96906048a","Type":"ContainerStarted","Data":"b65b9aeced60e6b54a95ee1f5501d459c45e422db6fa17a55783725f478f175f"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.639889 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.643841 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" event={"ID":"00918e77-d6a1-4d1b-9986-37f3e648b322","Type":"ContainerStarted","Data":"a38ba83fac41229554dec49b5d2efd44825a6028f22c7a88f041d1ef43dea75f"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.643866 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" event={"ID":"00918e77-d6a1-4d1b-9986-37f3e648b322","Type":"ContainerStarted","Data":"efcb66960883ede90dc195e440865f7e82c0b5f56d998bbc51bade6dcf61a851"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.651863 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" event={"ID":"8acca805-2bf8-4dcb-a036-c4732084210e","Type":"ContainerStarted","Data":"8b631f999700f2f51aa43e2ed2f5b2edb1d58820b4c614d63e8892b10d5e4413"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.657466 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" event={"ID":"a931c621-236d-4a2f-9c96-2c29483f19db","Type":"ContainerStarted","Data":"df7538fc54525f099380b778ea1af02a79e3048b1fbc5d5925880e93ce1305aa"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.662045 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sbn5h" podStartSLOduration=157.662029177 podStartE2EDuration="2m37.662029177s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.642401681 +0000 UTC m=+229.011028014" watchObservedRunningTime="2026-03-12 18:06:28.662029177 +0000 UTC m=+229.030655510" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.672584 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.674073 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" event={"ID":"fb73641e-b156-470d-be1a-52c11f2efdf6","Type":"ContainerStarted","Data":"fa0d8501714fa5a206cbf7622a0a89cc8e4757cce153b86a3505d49984e2488f"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.674176 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" event={"ID":"fb73641e-b156-470d-be1a-52c11f2efdf6","Type":"ContainerStarted","Data":"3fcc7bbe3084bb0081beb292a638fe49741f558a7fe0817779cbbfcad3ca40dd"} Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.675706 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.175695228 +0000 UTC m=+229.544321561 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.677066 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" event={"ID":"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51","Type":"ContainerStarted","Data":"9e9ac9fc823dfa71afcfab8c0327acbbbd68667ed466479a9ab182c4d3a10028"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.681329 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" podStartSLOduration=157.681315904 podStartE2EDuration="2m37.681315904s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.665037831 +0000 UTC m=+229.033664164" watchObservedRunningTime="2026-03-12 18:06:28.681315904 +0000 UTC m=+229.049942237" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.681735 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl" podStartSLOduration=157.681730126 podStartE2EDuration="2m37.681730126s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.680345967 +0000 UTC m=+229.048972310" watchObservedRunningTime="2026-03-12 18:06:28.681730126 +0000 UTC m=+229.050356459" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.683542 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerStarted","Data":"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.684335 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.686298 4926 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-c68kr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.686335 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.692189 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxml7" 
event={"ID":"4c9eaabb-c0c2-4f71-82eb-7494d0a37075","Type":"ContainerStarted","Data":"cf0dcaa07c4b2300045723a181b0c00320c05ebaa5b17c3d50437dd6250eb5db"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.692226 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxml7" event={"ID":"4c9eaabb-c0c2-4f71-82eb-7494d0a37075","Type":"ContainerStarted","Data":"ce252b86e505beb87d6ebc7320313de29a639b0e114586d0a2f5174a585af05e"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.693249 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" event={"ID":"5d073c88-4608-4594-9feb-f1093455368d","Type":"ContainerStarted","Data":"d77934aa0f03829fb16e2ee2875c4c7bb8bd67808e9108f3762f5211b24f4d70"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.694766 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" event={"ID":"d68160cf-4e6c-4294-bfdc-4acb74637ecb","Type":"ContainerStarted","Data":"71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.696059 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" event={"ID":"416787ca-b62c-4fd5-84c5-57be59317faa","Type":"ContainerStarted","Data":"bc0c7e300e9f1ca23244b9b1e4f774aed1405acdaf5be46e0f9cca806457d750"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.696079 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" event={"ID":"416787ca-b62c-4fd5-84c5-57be59317faa","Type":"ContainerStarted","Data":"4ae398fd482d6448935b238177f877b46b8c26bd31ee1f2f7798f802d6d585f7"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.709048 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pd4lm" event={"ID":"c574438a-86a5-4a9f-aff3-47bf920cbabd","Type":"ContainerStarted","Data":"499c10c00a7a67b5c7de38ccfaf74c7acd1f77f0a75dbbc45898fba6fc3872b5"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.709086 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pd4lm" event={"ID":"c574438a-86a5-4a9f-aff3-47bf920cbabd","Type":"ContainerStarted","Data":"d6fb422db11ac7fcfc5af5930643d0c8e176ffffc4dd20b87f46927d4c33fcb1"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.713221 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" event={"ID":"36a0d781-e388-41ae-878e-05a69d81c83e","Type":"ContainerStarted","Data":"d7b30f818d802e3531db55adca1135dc83e00abca124fbeb2915f0f83c88fa84"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.717472 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" podStartSLOduration=157.71742457 podStartE2EDuration="2m37.71742457s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.717042449 +0000 UTC m=+229.085668792" watchObservedRunningTime="2026-03-12 18:06:28.71742457 +0000 UTC m=+229.086050903" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.728952 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" event={"ID":"721fd54d-55f5-4477-a7b8-283a700c8c47","Type":"ContainerStarted","Data":"c069c12da1b73cd3855cc706b05648787a3c2f323859e6419a8cdbb2cc4085fa"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.728985 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" event={"ID":"721fd54d-55f5-4477-a7b8-283a700c8c47","Type":"ContainerStarted","Data":"f632cc51bbf1817e9bf02cf814a40e910839cfb355f7afa5f22369074b6a5e51"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.736364 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bmp6d" podStartSLOduration=157.736348856 podStartE2EDuration="2m37.736348856s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.734874866 +0000 UTC m=+229.103501209" watchObservedRunningTime="2026-03-12 18:06:28.736348856 +0000 UTC m=+229.104975189" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.743382 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" event={"ID":"f4867106-501e-4408-af0b-7790cfc45a24","Type":"ContainerStarted","Data":"893f7549880003c3bed9600e5e303162fde1f9a3b368133025d5441193467a5c"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.743422 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" event={"ID":"f4867106-501e-4408-af0b-7790cfc45a24","Type":"ContainerStarted","Data":"fc6fe3ee88adf1a106f670dc0abbabe0e6215202e5d0a3834c479f5e9ad5dd1a"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.746362 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" event={"ID":"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98","Type":"ContainerStarted","Data":"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.746895 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.748202 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" event={"ID":"f6ad9439-d12c-4987-840a-002975ba1498","Type":"ContainerStarted","Data":"c20e9eb36d7549aa43e55a83b069e75fd8e5138b013ae4e7628f337a9ab5601a"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.748230 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" event={"ID":"f6ad9439-d12c-4987-840a-002975ba1498","Type":"ContainerStarted","Data":"423ab1f7663178dcad7815397c58f06889c819795f4bfa80e43f7eec42b21289"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.748840 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.750788 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" 
event={"ID":"b99fa2a7-b47e-433c-b6d5-7c1306a79cde","Type":"ContainerStarted","Data":"7c2088451807c045409985d9ad0726d5bdce2c87ca611606a3f1d3b906f1416c"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.751461 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.755090 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" event={"ID":"4edd042a-f910-49c2-9220-56a6e79b04dc","Type":"ContainerStarted","Data":"2317c39f2953accfac56fb1efd908ec6062a7797866f162790fc901a5c9318d9"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.755122 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" event={"ID":"4edd042a-f910-49c2-9220-56a6e79b04dc","Type":"ContainerStarted","Data":"bf6d0c0b52a287d964339cdd9c78d27ee7b19ab7a4e80185b1b7ae31464e1f68"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.758183 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-drmq2" event={"ID":"a9fa6603-7fa6-4f96-9a3d-b6d3b0a0cd5b","Type":"ContainerStarted","Data":"1037edb6d6d02c15c85c19b1ac0f8d86103e850dccee7e65c579bbb97398254b"} Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764248 4926 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rdjk9 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764306 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" podUID="b99fa2a7-b47e-433c-b6d5-7c1306a79cde" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764602 4926 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-x4kkc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764619 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" podUID="f6ad9439-d12c-4987-840a-002975ba1498" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764662 4926 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-q49l5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.764676 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerName="controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.777579 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.778395 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.278374756 +0000 UTC m=+229.647001099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.798814 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sgl9z" podStartSLOduration=157.798796115 podStartE2EDuration="2m37.798796115s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.76879537 +0000 UTC m=+229.137421703" watchObservedRunningTime="2026-03-12 18:06:28.798796115 +0000 UTC m=+229.167422438" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.812862 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" podStartSLOduration=157.812842426 podStartE2EDuration="2m37.812842426s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.799623999 +0000 UTC m=+229.168250332" watchObservedRunningTime="2026-03-12 18:06:28.812842426 +0000 UTC m=+229.181468749" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.829090 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b9s9h" podStartSLOduration=157.829067688 podStartE2EDuration="2m37.829067688s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.816327724 +0000 UTC m=+229.184954057" watchObservedRunningTime="2026-03-12 18:06:28.829067688 +0000 UTC m=+229.197694021" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.851632 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4g7g" podStartSLOduration=157.851607876 podStartE2EDuration="2m37.851607876s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.85103896 +0000 UTC m=+229.219665293" watchObservedRunningTime="2026-03-12 18:06:28.851607876 +0000 UTC m=+229.220234209" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.885805 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rx7tc" podStartSLOduration=157.885789118 podStartE2EDuration="2m37.885789118s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.885355805 +0000 UTC m=+229.253982148" watchObservedRunningTime="2026-03-12 18:06:28.885789118 +0000 UTC m=+229.254415451" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.886474 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.886861 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.386845726 +0000 UTC m=+229.755472059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.890873 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:28 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:28 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:28 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.891276 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.910425 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-sj8r7" podStartSLOduration=157.910400593 podStartE2EDuration="2m37.910400593s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.906945036 +0000 UTC m=+229.275571389" watchObservedRunningTime="2026-03-12 18:06:28.910400593 +0000 UTC m=+229.279026926" Mar 12 
18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.931886 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pd4lm" podStartSLOduration=6.93186608 podStartE2EDuration="6.93186608s" podCreationTimestamp="2026-03-12 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.928475116 +0000 UTC m=+229.297101469" watchObservedRunningTime="2026-03-12 18:06:28.93186608 +0000 UTC m=+229.300492413" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.971843 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h5mf8" podStartSLOduration=157.971827793 podStartE2EDuration="2m37.971827793s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.958746259 +0000 UTC m=+229.327372592" watchObservedRunningTime="2026-03-12 18:06:28.971827793 +0000 UTC m=+229.340454126" Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.988370 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.988538 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.488515528 +0000 UTC m=+229.857141871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.988718 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:28 crc kubenswrapper[4926]: E0312 18:06:28.989358 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.489330361 +0000 UTC m=+229.857956694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:28 crc kubenswrapper[4926]: I0312 18:06:28.995756 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-drmq2" podStartSLOduration=6.995727009 podStartE2EDuration="6.995727009s" podCreationTimestamp="2026-03-12 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:28.993149056 +0000 UTC m=+229.361775409" watchObservedRunningTime="2026-03-12 18:06:28.995727009 +0000 UTC m=+229.364353352" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.035216 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9" podStartSLOduration=158.035193317 podStartE2EDuration="2m38.035193317s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.015569351 +0000 UTC m=+229.384195714" watchObservedRunningTime="2026-03-12 18:06:29.035193317 +0000 UTC m=+229.403819650" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.063096 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" podStartSLOduration=158.063067203 podStartE2EDuration="2m38.063067203s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.06185155 +0000 UTC m=+229.430477903" watchObservedRunningTime="2026-03-12 18:06:29.063067203 +0000 UTC m=+229.431693536" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.063671 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xxzt9" podStartSLOduration=158.06366691 podStartE2EDuration="2m38.06366691s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.038609602 +0000 UTC m=+229.407235945" watchObservedRunningTime="2026-03-12 18:06:29.06366691 +0000 UTC m=+229.432293243" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.082417 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" podStartSLOduration=158.082163645 podStartE2EDuration="2m38.082163645s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.080736685 +0000 UTC m=+229.449363018" watchObservedRunningTime="2026-03-12 18:06:29.082163645 +0000 UTC m=+229.450789968" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.090009 4926 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.090416 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.590402185 +0000 UTC m=+229.959028518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.099323 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" podStartSLOduration=158.099305333 podStartE2EDuration="2m38.099305333s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.094354654 +0000 UTC m=+229.462980997" watchObservedRunningTime="2026-03-12 18:06:29.099305333 +0000 UTC m=+229.467931666" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.191615 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.191966 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.691955302 +0000 UTC m=+230.060581635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.215008 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.215062 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.217056 4926 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-9bzkt container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.217104 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" podUID="7e94e6e0-b16d-462f-b791-ba20acdcb809" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.292209 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.292512 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.792498252 +0000 UTC m=+230.161124585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.393643 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.394006 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-12 18:06:29.893994138 +0000 UTC m=+230.262620471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.410631 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38706: no serving certificate available for the kubelet" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.494661 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.494905 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.994874236 +0000 UTC m=+230.363500569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.495122 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.495646 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:29.995618027 +0000 UTC m=+230.364244360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.596835 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.597486 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.097424393 +0000 UTC m=+230.466050726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.597690 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.598359 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.098346908 +0000 UTC m=+230.466973391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.699700 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.699923 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.199890795 +0000 UTC m=+230.568517128 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.700195 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.700617 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.200604865 +0000 UTC m=+230.569231198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.766174 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" event={"ID":"00918e77-d6a1-4d1b-9986-37f3e648b322","Type":"ContainerStarted","Data":"3e51472c7bd12f10625affaa4c1e75ad538e1e3179fc73e200eb25164934a4ab"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.770017 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" event={"ID":"8acca805-2bf8-4dcb-a036-c4732084210e","Type":"ContainerStarted","Data":"aaf60efe6696e6166729de2b984c3370705048c3fbc4edc6148242085d71fdf0"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.772774 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" event={"ID":"48a0fa25-6b2d-4668-b8e5-824912077f19","Type":"ContainerStarted","Data":"fc95ef35d9461ed11dbc47eae66311fbb2ff949ccc9eb5153beb2fcf366b41fe"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.778897 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" event={"ID":"f68a2b5d-036a-43ee-a9f2-dd94d9f51d51","Type":"ContainerStarted","Data":"de5090f0d7a82ccf34bb8bd7e922d4f50aec2f776b8c7b6207b9ff51c590847a"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.781401 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" event={"ID":"6aa7a74a-b3d4-47df-a1b1-c83a5639e592","Type":"ContainerStarted","Data":"15607ad279c98c97717385e271a19c69b978cd455519c54ef15b0977d9f167cf"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.781700 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.783536 4926 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-nhq52 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.783594 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" podUID="6aa7a74a-b3d4-47df-a1b1-c83a5639e592" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.788919 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rxml7" event={"ID":"4c9eaabb-c0c2-4f71-82eb-7494d0a37075","Type":"ContainerStarted","Data":"e00dd505fdeef5237730f913bf86e06154c5951c844ff5a324b340d0199cacc6"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.789029 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.791733 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dnk9m" podStartSLOduration=158.791717122 podStartE2EDuration="2m38.791717122s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.791190227 +0000 UTC m=+230.159816580" watchObservedRunningTime="2026-03-12 18:06:29.791717122 +0000 UTC m=+230.160343455" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.800228 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" event={"ID":"0f14f5e4-148e-4282-9243-89c96906048a","Type":"ContainerStarted","Data":"651b0bd421c4418ad38b69f4049bd85d537ba3d6a63234f7cb3bdda9af139e81"} Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.801149 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.801204 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.801149 4926 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-x4kkc container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.801284 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc" podUID="f6ad9439-d12c-4987-840a-002975ba1498" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.801976 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerName="controller-manager" containerID="cri-o://239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7" gracePeriod=30 Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.802760 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerName="route-controller-manager" containerID="cri-o://8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d" gracePeriod=30 Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.804050 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.804215 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.30419875 +0000 UTC m=+230.672825083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.804515 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.806062 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.306050861 +0000 UTC m=+230.674677194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.815856 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5"
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.817716 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rxml7" podStartSLOduration=7.817701056 podStartE2EDuration="7.817701056s" podCreationTimestamp="2026-03-12 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.816525572 +0000 UTC m=+230.185151935" watchObservedRunningTime="2026-03-12 18:06:29.817701056 +0000 UTC m=+230.186327389"
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.865546 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 18:06:29 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld
Mar 12 18:06:29 crc kubenswrapper[4926]: [+]process-running ok
Mar 12 18:06:29 crc kubenswrapper[4926]: healthz check failed
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.865606 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.878903 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdjk9"
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.908348 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.909814 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.40978967 +0000 UTC m=+230.778415993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.909871 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:29 crc kubenswrapper[4926]: E0312 18:06:29.910963 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.410943532 +0000 UTC m=+230.779569865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.953961 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr"
Mar 12 18:06:29 crc kubenswrapper[4926]: I0312 18:06:29.959907 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52" podStartSLOduration=158.959885384 podStartE2EDuration="2m38.959885384s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:29.954395322 +0000 UTC m=+230.323021655" watchObservedRunningTime="2026-03-12 18:06:29.959885384 +0000 UTC m=+230.328511717"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.030148 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.030760 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.530738738 +0000 UTC m=+230.899365071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.088001 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bsvxw" podStartSLOduration=159.087981141 podStartE2EDuration="2m39.087981141s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:30.086519481 +0000 UTC m=+230.455145824" watchObservedRunningTime="2026-03-12 18:06:30.087981141 +0000 UTC m=+230.456607464"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.089208 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" podStartSLOduration=159.089202055 podStartE2EDuration="2m39.089202055s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:30.045135639 +0000 UTC m=+230.413761962" watchObservedRunningTime="2026-03-12 18:06:30.089202055 +0000 UTC m=+230.457828388"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.132942 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.133409 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.633385835 +0000 UTC m=+231.002012328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.234255 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.234498 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.73446801 +0000 UTC m=+231.103094363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.234720 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.235112 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.735098658 +0000 UTC m=+231.103724991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.335841 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.336430 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.836417239 +0000 UTC m=+231.205043562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.438307 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.438783 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:30.938758338 +0000 UTC m=+231.307384681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.540001 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.540414 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.040399029 +0000 UTC m=+231.409025362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.582810 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.604028 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.640757 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles\") pod \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.640911 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config\") pod \"4230f869-9456-44a1-87b3-342fc8c18ed7\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641022 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlg4x\" (UniqueName: \"kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x\") pod \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641107 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca\") pod \"4230f869-9456-44a1-87b3-342fc8c18ed7\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641196 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca\") pod \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641276 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config\") pod \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641366 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert\") pod \"4230f869-9456-44a1-87b3-342fc8c18ed7\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641755 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert\") pod \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\" (UID: \"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.641862 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28x4l\" (UniqueName: \"kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l\") pod \"4230f869-9456-44a1-87b3-342fc8c18ed7\" (UID: \"4230f869-9456-44a1-87b3-342fc8c18ed7\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.642112 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.642402 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.142391028 +0000 UTC m=+231.511017361 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.643335 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" (UID: "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.643645 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config" (OuterVolumeSpecName: "config") pod "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" (UID: "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.644308 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config" (OuterVolumeSpecName: "config") pod "4230f869-9456-44a1-87b3-342fc8c18ed7" (UID: "4230f869-9456-44a1-87b3-342fc8c18ed7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.645393 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca" (OuterVolumeSpecName: "client-ca") pod "4230f869-9456-44a1-87b3-342fc8c18ed7" (UID: "4230f869-9456-44a1-87b3-342fc8c18ed7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.653394 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"]
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.658036 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerName="route-controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658075 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerName="route-controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.658094 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerName="controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658104 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerName="controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658233 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerName="controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658256 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerName="route-controller-manager"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658649 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" (UID: "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.658755 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.663676 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4230f869-9456-44a1-87b3-342fc8c18ed7" (UID: "4230f869-9456-44a1-87b3-342fc8c18ed7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.664891 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" (UID: "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.665060 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x" (OuterVolumeSpecName: "kube-api-access-nlg4x") pod "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" (UID: "ab2da4b6-fc75-4f97-b27c-3f687ddf9d98"). InnerVolumeSpecName "kube-api-access-nlg4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.699766 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l" (OuterVolumeSpecName: "kube-api-access-28x4l") pod "4230f869-9456-44a1-87b3-342fc8c18ed7" (UID: "4230f869-9456-44a1-87b3-342fc8c18ed7"). InnerVolumeSpecName "kube-api-access-28x4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.707394 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"]
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.743891 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744075 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744103 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744132 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744164 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbtx\" (UniqueName: \"kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744248 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlg4x\" (UniqueName: \"kubernetes.io/projected/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-kube-api-access-nlg4x\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744259 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744269 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744277 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744285 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4230f869-9456-44a1-87b3-342fc8c18ed7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744295 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744303 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28x4l\" (UniqueName: \"kubernetes.io/projected/4230f869-9456-44a1-87b3-342fc8c18ed7-kube-api-access-28x4l\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744311 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.744319 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4230f869-9456-44a1-87b3-342fc8c18ed7-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.744391 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.244375738 +0000 UTC m=+231.613002061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.846420 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.846510 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.846545 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hbtx\" (UniqueName: \"kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.846577 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.847085 4926 generic.go:334] "Generic (PLEG): container finished" podID="4edd042a-f910-49c2-9220-56a6e79b04dc" containerID="2317c39f2953accfac56fb1efd908ec6062a7797866f162790fc901a5c9318d9" exitCode=0
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.847148 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" event={"ID":"4edd042a-f910-49c2-9220-56a6e79b04dc","Type":"ContainerDied","Data":"2317c39f2953accfac56fb1efd908ec6062a7797866f162790fc901a5c9318d9"}
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.849064 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.849136 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.849191 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.849225 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.849429 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.349413322 +0000 UTC m=+231.718039745 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.850338 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.859588 4926 generic.go:334] "Generic (PLEG): container finished" podID="4230f869-9456-44a1-87b3-342fc8c18ed7" containerID="8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d" exitCode=0
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.859652 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" event={"ID":"4230f869-9456-44a1-87b3-342fc8c18ed7","Type":"ContainerDied","Data":"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"}
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.859679 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln" event={"ID":"4230f869-9456-44a1-87b3-342fc8c18ed7","Type":"ContainerDied","Data":"01692fb622a525e9c42adef3bc2dd9239f99fa47f15b51038978d9cd4978f5eb"}
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.859699 4926 scope.go:117] "RemoveContainer" containerID="8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.859856 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.866879 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.870070 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbtx\" (UniqueName: \"kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx\") pod \"controller-manager-747cdc9dc9-hpk65\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.870275 4926 generic.go:334] "Generic (PLEG): container finished" podID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" containerID="239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7" exitCode=0
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.870834 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.873815 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" event={"ID":"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98","Type":"ContainerDied","Data":"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"}
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.873894 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-q49l5" event={"ID":"ab2da4b6-fc75-4f97-b27c-3f687ddf9d98","Type":"ContainerDied","Data":"0e10ddf44702564bdb0735ff145d6f01e01df0a99bcf4c2caaa2097751e18b0f"}
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.876332 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.876370 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.877268 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 18:06:30 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld
Mar 12 18:06:30 crc kubenswrapper[4926]: [+]process-running ok
Mar 12 18:06:30 crc kubenswrapper[4926]: healthz check failed
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.877313 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.896770 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x4kkc"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.912096 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h95sl"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.942651 4926 scope.go:117] "RemoveContainer" containerID="8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.943454 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d\": container with ID starting with 8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d not found: ID does not exist" containerID="8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.943488 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d"} err="failed to get container status \"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d\": rpc error: code = NotFound desc = could not find container \"8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d\": container with ID starting with 8dee9c013e3c6386c5835b2c2d55dd9c7c4c750854cac3e4491b0748bba2145d not found: ID does not exist"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.943511 4926 scope.go:117] "RemoveContainer" containerID="239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.943870 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-nhq52"
Mar 12 18:06:30 crc kubenswrapper[4926]: I0312 18:06:30.951062 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:30 crc kubenswrapper[4926]: E0312 18:06:30.952272 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.452245116 +0000 UTC m=+231.820871449 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.006747 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.011318 4926 scope.go:117] "RemoveContainer" containerID="239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.012685 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7\": container with ID starting with 239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7 not found: ID does not exist" containerID="239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.012719 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7"} err="failed to get container status \"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7\": rpc error: code = NotFound desc = could not find container \"239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7\": container with ID starting with 239dd029acb9c93acb5f73faccb1f785db98cc3574d192e8eef48910ed524ac7 not found: ID does not exist"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.018483 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-svrln"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.026243 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.031868 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.034687 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-q49l5"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.046655 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s4zf5"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.047611 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.055954 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.061818 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4zf5"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.070980 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.071358 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.571342442 +0000 UTC m=+231.939968775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.149079 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-565fl"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.150062 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.158596 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.173896 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.174169 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.174242 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf7tf\" (UniqueName: \"kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.174284 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.174392 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.674375661 +0000 UTC m=+232.043001994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.177883 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-565fl"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.282989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283032 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf7tf\" (UniqueName: \"kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283054 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283085 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxfn\" (UniqueName: \"kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283106 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283158 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.283207 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.283703 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.783692375 +0000 UTC m=+232.152318698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.284155 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.284354 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.335947 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf7tf\" (UniqueName: \"kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf\") pod \"community-operators-s4zf5\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.350866 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t4sbh"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.351865 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.367788 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t4sbh"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.385004 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.385199 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.385229 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.385259 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wxfn\" (UniqueName: \"kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.386208 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.886162238 +0000 UTC m=+232.254788571 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.386327 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.386334 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.414234 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxfn\" (UniqueName: \"kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn\") pod \"certified-operators-565fl\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.416246 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4zf5"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.478360 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.486528 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjm4\" (UniqueName: \"kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.486590 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.486612 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.486640 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.486885 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:31.986872853 +0000 UTC m=+232.355499186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.493339 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-565fl"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.539810 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.540877 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpnq8"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.557411 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"]
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.591804 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.592505 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjm4\" (UniqueName: \"kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.593109 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.593186 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.597542 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.097518603 +0000 UTC m=+232.466144936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.598031 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.599089 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.617408 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnjm4\" (UniqueName: \"kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4\") pod \"community-operators-t4sbh\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " pod="openshift-marketplace/community-operators-t4sbh"
Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.685731 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.698083 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.698309 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.698492 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsdsv\" (UniqueName: \"kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.698713 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.698850 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.198831644 +0000 UTC m=+232.567458057 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.750483 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4zf5"] Mar 12 18:06:31 crc kubenswrapper[4926]: W0312 18:06:31.799118 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a609cd_c298_4356_9ddf_a7f125b52938.slice/crio-6da98836531143c0f0c9e5b0f54f53b5a8c52f32a98b1d55d2cd5e57c0ec7c9e WatchSource:0}: Error finding container 6da98836531143c0f0c9e5b0f54f53b5a8c52f32a98b1d55d2cd5e57c0ec7c9e: Status 404 returned error can't find the container with id 6da98836531143c0f0c9e5b0f54f53b5a8c52f32a98b1d55d2cd5e57c0ec7c9e Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.807796 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.808100 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.808235 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.808318 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdsv\" (UniqueName: \"kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.808671 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.308656082 +0000 UTC m=+232.677282415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.809243 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.809253 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.842633 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdsv\" (UniqueName: \"kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv\") pod \"certified-operators-zpnq8\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.846130 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.846848 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.854894 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.855059 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.858025 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.880698 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:31 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:31 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:31 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.880961 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.880721 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.910113 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:31 crc kubenswrapper[4926]: E0312 18:06:31.910464 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.410452116 +0000 UTC m=+232.779078449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.959387 4926 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.970243 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerStarted","Data":"6da98836531143c0f0c9e5b0f54f53b5a8c52f32a98b1d55d2cd5e57c0ec7c9e"} Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.979535 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" event={"ID":"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a","Type":"ContainerStarted","Data":"5c2190b7dcff0a0981911cc6e9cc8e221d3f909aececbc7028daeebdaaddefca"} Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.979574 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" event={"ID":"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a","Type":"ContainerStarted","Data":"9b97c5c1e3245e013da7b8a53533ab149e2f2a6606fb877d381e7f0febac2c97"} Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.980071 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.986156 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" event={"ID":"8acca805-2bf8-4dcb-a036-c4732084210e","Type":"ContainerStarted","Data":"86d6b8eb4e0c5d613bb37d960f1cb609607647ef4a83a5fee4e0dc3cd1316553"} Mar 12 18:06:31 crc kubenswrapper[4926]: I0312 18:06:31.986198 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" event={"ID":"8acca805-2bf8-4dcb-a036-c4732084210e","Type":"ContainerStarted","Data":"945ca6bf80b6220cef8ef56354c13d5eb325de432c72b426448c9273ad83e0c1"} Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.003073 4926 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-565fl"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.004670 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" podStartSLOduration=4.00464958 podStartE2EDuration="4.00464958s" podCreationTimestamp="2026-03-12 18:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:32.002521811 +0000 UTC m=+232.371148144" watchObservedRunningTime="2026-03-12 18:06:32.00464958 +0000 UTC m=+232.373275913" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.008482 4926 patch_prober.go:28] interesting pod/controller-manager-747cdc9dc9-hpk65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.008536 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.012095 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.012528 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.012588 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.012754 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.512736855 +0000 UTC m=+232.881363188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.013338 4926 ???:1] "http: TLS handshake error from 192.168.126.11:38708: no serving certificate available for the kubelet" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.114065 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.114106 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.114196 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.118555 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.61853767 +0000 UTC m=+232.987164003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.119190 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.151831 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.199789 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.215766 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.216148 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.716133078 +0000 UTC m=+233.084759411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.313253 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t4sbh"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.317148 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.317529 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.817508121 +0000 UTC m=+233.186134514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.333258 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.420197 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.420394 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.920357544 +0000 UTC m=+233.288983877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.420644 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume\") pod \"4edd042a-f910-49c2-9220-56a6e79b04dc\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.420725 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume\") pod \"4edd042a-f910-49c2-9220-56a6e79b04dc\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.420745 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9lz\" (UniqueName: \"kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz\") pod \"4edd042a-f910-49c2-9220-56a6e79b04dc\" (UID: \"4edd042a-f910-49c2-9220-56a6e79b04dc\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.420925 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.421302 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:32.92128586 +0000 UTC m=+233.289912193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.422270 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume" (OuterVolumeSpecName: "config-volume") pod "4edd042a-f910-49c2-9220-56a6e79b04dc" (UID: "4edd042a-f910-49c2-9220-56a6e79b04dc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.432606 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.433344 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4edd042a-f910-49c2-9220-56a6e79b04dc" (UID: "4edd042a-f910-49c2-9220-56a6e79b04dc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.443296 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz" (OuterVolumeSpecName: "kube-api-access-hq9lz") pod "4edd042a-f910-49c2-9220-56a6e79b04dc" (UID: "4edd042a-f910-49c2-9220-56a6e79b04dc"). InnerVolumeSpecName "kube-api-access-hq9lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.520603 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4230f869-9456-44a1-87b3-342fc8c18ed7" path="/var/lib/kubelet/pods/4230f869-9456-44a1-87b3-342fc8c18ed7/volumes" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.521298 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2da4b6-fc75-4f97-b27c-3f687ddf9d98" path="/var/lib/kubelet/pods/ab2da4b6-fc75-4f97-b27c-3f687ddf9d98/volumes" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.522382 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.522833 4926 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4edd042a-f910-49c2-9220-56a6e79b04dc-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.522854 4926 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4edd042a-f910-49c2-9220-56a6e79b04dc-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.522866 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9lz\" (UniqueName: \"kubernetes.io/projected/4edd042a-f910-49c2-9220-56a6e79b04dc-kube-api-access-hq9lz\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.522932 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:33.02291541 +0000 UTC m=+233.391541743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.523332 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 18:06:32 crc kubenswrapper[4926]: W0312 18:06:32.573428 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0156cb9a_b076_4cc2_9a14_68e06a5bbed4.slice/crio-c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4 WatchSource:0}: Error finding container c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4: Status 404 returned error can't find the container with id c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4 Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.624008 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.624398 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:33.124385815 +0000 UTC m=+233.493012148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.707586 4926 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-12T18:06:31.95942032Z","Handler":null,"Name":""} Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.725495 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.725784 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 18:06:33.225753638 +0000 UTC m=+233.594379971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.726721 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.727299 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 18:06:33.227276551 +0000 UTC m=+233.595902894 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g6fzt" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.736315 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:06:32 crc kubenswrapper[4926]: E0312 18:06:32.736680 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edd042a-f910-49c2-9220-56a6e79b04dc" containerName="collect-profiles" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.736700 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edd042a-f910-49c2-9220-56a6e79b04dc" containerName="collect-profiles" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.736830 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edd042a-f910-49c2-9220-56a6e79b04dc" containerName="collect-profiles" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.737312 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.743822 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.744078 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.744713 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.744910 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.745052 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.745200 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.765452 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.782345 4926 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.782383 4926 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.807562 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.808278 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.821202 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.822017 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.824400 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.827284 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.827581 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6dx\" (UniqueName: \"kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.827629 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.827657 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.827703 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.871391 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.876927 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:32 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:32 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:32 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.876982 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.929845 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.929906 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.930051 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.930285 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6dx\" (UniqueName: \"kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.930369 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.930397 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.930475 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.931885 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.933046 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.937470 4926 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.937527 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.938588 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.955261 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6dx\" (UniqueName: \"kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx\") pod \"route-controller-manager-758549c5dd-zclbs\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.995361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0156cb9a-b076-4cc2-9a14-68e06a5bbed4","Type":"ContainerStarted","Data":"c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4"} Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.998164 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f" event={"ID":"4edd042a-f910-49c2-9220-56a6e79b04dc","Type":"ContainerDied","Data":"bf6d0c0b52a287d964339cdd9c78d27ee7b19ab7a4e80185b1b7ae31464e1f68"} 
Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.998189 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6d0c0b52a287d964339cdd9c78d27ee7b19ab7a4e80185b1b7ae31464e1f68"
Mar 12 18:06:32 crc kubenswrapper[4926]: I0312 18:06:32.998271 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555640-5965f"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.005309 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g6fzt\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") " pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.005381 4926 generic.go:334] "Generic (PLEG): container finished" podID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerID="fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8" exitCode=0
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.005425 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerDied","Data":"fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.005456 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerStarted","Data":"2ea4f4bc9037632e2b998da8312ddbb98d227eabc09259bd968b95eab6d3b562"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.016589 4926 generic.go:334] "Generic (PLEG): container finished" podID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerID="12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447" exitCode=0
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.016999 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerDied","Data":"12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.022217 4926 generic.go:334] "Generic (PLEG): container finished" podID="150781c8-5ae3-42a6-b351-2388dfe84167" containerID="2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139" exitCode=0
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.022291 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerDied","Data":"2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.022319 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerStarted","Data":"f0734678dc103280ac23f9e4a0964bfd2dcfda20bb5ae8365cb587d30b175b62"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.031755 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.031814 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.031929 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.037882 4926 generic.go:334] "Generic (PLEG): container finished" podID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerID="80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9" exitCode=0
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.038714 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerDied","Data":"80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.038766 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerStarted","Data":"dc92306c9f92008ccb5e4b67f4e37a7020d96c316ccb9bc2ecf40d9ac23d9cf2"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.061590 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.062562 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" event={"ID":"8acca805-2bf8-4dcb-a036-c4732084210e","Type":"ContainerStarted","Data":"72c3731c75f68b012ce99f7bd7ad25d84a188d56b928b2ed7e8919fba03a328b"}
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.065640 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.070142 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.125106 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-c9xl5" podStartSLOduration=11.125089108 podStartE2EDuration="11.125089108s" podCreationTimestamp="2026-03-12 18:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:33.120807018 +0000 UTC m=+233.489433351" watchObservedRunningTime="2026-03-12 18:06:33.125089108 +0000 UTC m=+233.493715441"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.154236 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"]
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.155811 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.155941 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.161179 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.194942 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"]
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.200762 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.247245 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.247871 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.248045 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwlr2\" (UniqueName: \"kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.364552 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.364645 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.364709 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwlr2\" (UniqueName: \"kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.365718 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.365919 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities\") pod \"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt"
Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.391228 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwlr2\" (UniqueName: \"kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2\") pod
\"redhat-marketplace-wz6qt\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.553856 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.564080 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.565365 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.587561 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.676271 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.676327 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.676381 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgrh\" (UniqueName: \"kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.688058 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.743763 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.762209 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.777900 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.777951 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.777998 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pkgrh\" (UniqueName: \"kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.778649 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.778851 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.814967 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgrh\" (UniqueName: \"kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh\") pod \"redhat-marketplace-v5qw8\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.863327 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:33 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:33 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:33 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.863717 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.903986 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.949689 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"] Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.999284 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:33 crc kubenswrapper[4926]: I0312 18:06:33.999343 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.005626 4926 patch_prober.go:28] interesting pod/console-f9d7485db-vb9qx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.005764 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vb9qx" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 12 18:06:34 crc kubenswrapper[4926]: W0312 18:06:34.009708 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod702daa2d_851e_4c3d_be86_4f337b4462f7.slice/crio-1ca7001bbffd391adca6fddca7abaa8f8e5f4a6dec4a426455ef61c2d31fac2d WatchSource:0}: Error finding container 1ca7001bbffd391adca6fddca7abaa8f8e5f4a6dec4a426455ef61c2d31fac2d: Status 404 returned error can't find the container with id 1ca7001bbffd391adca6fddca7abaa8f8e5f4a6dec4a426455ef61c2d31fac2d Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.079600 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" event={"ID":"7e957b87-36f8-433f-9547-70bf22354018","Type":"ContainerStarted","Data":"9f487eeed2150c2eeef5da93b9c67da82e7c33df3d94484e998aa4c72eec6d5a"} Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.081638 4926 generic.go:334] "Generic (PLEG): container finished" podID="0156cb9a-b076-4cc2-9a14-68e06a5bbed4" containerID="a9d8fd89ceed81099f6030ffc1f3c040d3c1980ae97507aa4cd5759f3a0ff3d1" exitCode=0 Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.081692 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0156cb9a-b076-4cc2-9a14-68e06a5bbed4","Type":"ContainerDied","Data":"a9d8fd89ceed81099f6030ffc1f3c040d3c1980ae97507aa4cd5759f3a0ff3d1"} Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.083936 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d00c7182-7f78-4452-a377-670193a7366b","Type":"ContainerStarted","Data":"bca04b15f8cd7489625ca0ac157b17db2f261392d7437acf647da36fff99051b"} Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.109541 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" event={"ID":"97b0faa2-bcb2-417e-9065-3156860a8644","Type":"ContainerStarted","Data":"5fd53c65057b16b866539f523a9ef035c2f73d7900692a8369fc22f3d2f8f197"} Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.111009 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerStarted","Data":"1ca7001bbffd391adca6fddca7abaa8f8e5f4a6dec4a426455ef61c2d31fac2d"} Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.128768 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.128803 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.144630 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.151861 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.152852 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.156605 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.186054 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.223672 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.233992 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9bzkt" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.294892 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.295086 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57ncv\" (UniqueName: \"kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.295662 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.397288 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:34 crc 
kubenswrapper[4926]: I0312 18:06:34.397379 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57ncv\" (UniqueName: \"kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.397413 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.397476 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.397534 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.398002 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.399450 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.409405 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.421189 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.448482 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.451281 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-57ncv\" (UniqueName: \"kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv\") pod \"redhat-operators-zx9h7\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.503851 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.504120 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.504182 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.510290 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.513776 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.522879 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.538992 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.548065 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.562950 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.563991 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.587515 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.720411 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.720773 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4msz\" (UniqueName: \"kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.720802 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.807166 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.822377 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.822427 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4msz\" (UniqueName: \"kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.822467 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.822890 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.822938 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content\") pod \"redhat-operators-hmn6x\" (UID: 
\"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.862249 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4msz\" (UniqueName: \"kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz\") pod \"redhat-operators-hmn6x\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.865551 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.880643 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:34 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:34 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:34 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.880695 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:34 crc kubenswrapper[4926]: I0312 18:06:34.896559 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.133895 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" event={"ID":"7e957b87-36f8-433f-9547-70bf22354018","Type":"ContainerStarted","Data":"404fc2dbd169e6a506a47a16f7ea299950a21ae0b3144c3fe7d945d26a749973"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.134392 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.139710 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.148077 4926 generic.go:334] "Generic (PLEG): container finished" podID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerID="ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3" exitCode=0 Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.148145 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerDied","Data":"ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.148165 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerStarted","Data":"e7bb37ba860cfb3ecfc47e3421fdc163d8b676f30a5d56c59a6b61ad57e2cd0c"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.152533 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" podStartSLOduration=7.152521569 podStartE2EDuration="7.152521569s" podCreationTimestamp="2026-03-12 18:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:35.147420997 +0000 UTC m=+235.516047330" watchObservedRunningTime="2026-03-12 18:06:35.152521569 +0000 UTC m=+235.521147902" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.159693 4926 generic.go:334] "Generic (PLEG): container finished" podID="d00c7182-7f78-4452-a377-670193a7366b" containerID="01d6a31eb39da97367417192915f91b07b207f5bf4fbb0c3e97c4026c48a7c65" exitCode=0 Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.159751 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d00c7182-7f78-4452-a377-670193a7366b","Type":"ContainerDied","Data":"01d6a31eb39da97367417192915f91b07b207f5bf4fbb0c3e97c4026c48a7c65"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.174921 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" event={"ID":"97b0faa2-bcb2-417e-9065-3156860a8644","Type":"ContainerStarted","Data":"f7c989898f3b81f561b1ebddb62bcb7b02ee6fac8bd401ab3a5ff75b154fda11"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.175227 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.188263 4926 generic.go:334] "Generic (PLEG): container finished" podID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerID="7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa" exitCode=0 Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.189896 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerDied","Data":"7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa"} Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.193962 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z95pp" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.230407 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" podStartSLOduration=164.230390148 podStartE2EDuration="2m44.230390148s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:06:35.229234735 +0000 UTC m=+235.597861068" watchObservedRunningTime="2026-03-12 18:06:35.230390148 +0000 UTC m=+235.599016481" Mar 12 18:06:35 crc kubenswrapper[4926]: W0312 18:06:35.435370 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f741f51ff41e6e41540c659345accd524bff2e736d76e15c0688827a3209d685 WatchSource:0}: Error finding container f741f51ff41e6e41540c659345accd524bff2e736d76e15c0688827a3209d685: Status 404 returned error can't find the container with id f741f51ff41e6e41540c659345accd524bff2e736d76e15c0688827a3209d685 Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.503364 4926 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.605762 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:06:35 crc kubenswrapper[4926]: W0312 18:06:35.616383 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod637236a6_6287_401d_a2cd_78713aa03176.slice/crio-b41dbdfd44a141e983428f384fe92d330459deb4bd26d614487e897aef5edb00 WatchSource:0}: Error finding container b41dbdfd44a141e983428f384fe92d330459deb4bd26d614487e897aef5edb00: Status 404 returned error can't find the container with id b41dbdfd44a141e983428f384fe92d330459deb4bd26d614487e897aef5edb00 Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.671923 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.671960 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.672064 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.672111 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.706261 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.854726 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir\") pod \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.855111 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0156cb9a-b076-4cc2-9a14-68e06a5bbed4" (UID: "0156cb9a-b076-4cc2-9a14-68e06a5bbed4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.855563 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access\") pod \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\" (UID: \"0156cb9a-b076-4cc2-9a14-68e06a5bbed4\") " Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.856125 4926 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.865148 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:35 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:35 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:35 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.865225 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.891848 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0156cb9a-b076-4cc2-9a14-68e06a5bbed4" (UID: "0156cb9a-b076-4cc2-9a14-68e06a5bbed4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:06:35 crc kubenswrapper[4926]: I0312 18:06:35.957456 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0156cb9a-b076-4cc2-9a14-68e06a5bbed4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.224896 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"74230acbcb2ff2532edb6856289695dc59296f47c7f8b82ad1bac248d06f52e3"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.224939 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1f4b5d0934532a9a6ba9d33f6a6ad066d61dd50d39fc0e2594fed50d2467c7d8"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.225132 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.230273 4926 generic.go:334] "Generic (PLEG): container finished" podID="637236a6-6287-401d-a2cd-78713aa03176" containerID="76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1" exitCode=0 Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.230358 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerDied","Data":"76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.230424 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerStarted","Data":"b41dbdfd44a141e983428f384fe92d330459deb4bd26d614487e897aef5edb00"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.242259 4926 generic.go:334] "Generic (PLEG): container finished" podID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerID="3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a" exitCode=0 Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.242644 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerDied","Data":"3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.242731 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerStarted","Data":"d295389691829deab9da8e29d0c26dfd809e13bacff8864270d0d1a8c5b14dcd"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.253425 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.246958 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bc5d581fe3ff1e668de4215b5f77628054188baca4bf123fd67c0191c436b563"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.254049 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fa1688dc92d10fddca52e3788554a0fe6becc03f8ec0c58b1997076956d0a899"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.254070 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1447bf24555e677d095d00ea7874b121bd3cb802eddb13ee5c1823a91ae9712c"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.254084 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f741f51ff41e6e41540c659345accd524bff2e736d76e15c0688827a3209d685"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.254097 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0156cb9a-b076-4cc2-9a14-68e06a5bbed4","Type":"ContainerDied","Data":"c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4"} Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.254116 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c488202eea99f9ecf2c2a185d13fc76aef801aa5dd8c49cf087101de5890d5a4" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.700596 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.773121 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir\") pod \"d00c7182-7f78-4452-a377-670193a7366b\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.773225 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access\") pod \"d00c7182-7f78-4452-a377-670193a7366b\" (UID: \"d00c7182-7f78-4452-a377-670193a7366b\") " Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.773511 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d00c7182-7f78-4452-a377-670193a7366b" (UID: "d00c7182-7f78-4452-a377-670193a7366b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.780480 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d00c7182-7f78-4452-a377-670193a7366b" (UID: "d00c7182-7f78-4452-a377-670193a7366b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.875208 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d00c7182-7f78-4452-a377-670193a7366b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.875246 4926 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d00c7182-7f78-4452-a377-670193a7366b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.878346 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:36 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:36 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:36 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:36 crc kubenswrapper[4926]: I0312 18:06:36.878402 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.188145 4926 ???:1] "http: TLS handshake error from 192.168.126.11:50054: no serving certificate available for the kubelet" Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.316661 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d00c7182-7f78-4452-a377-670193a7366b","Type":"ContainerDied","Data":"bca04b15f8cd7489625ca0ac157b17db2f261392d7437acf647da36fff99051b"} Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.316742 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca04b15f8cd7489625ca0ac157b17db2f261392d7437acf647da36fff99051b" Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.316787 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.865778 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:37 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:37 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:37 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:37 crc kubenswrapper[4926]: I0312 18:06:37.865840 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:38 crc kubenswrapper[4926]: I0312 18:06:38.865377 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:38 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:38 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:38 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:38 crc kubenswrapper[4926]: I0312 18:06:38.865467 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:39 crc kubenswrapper[4926]: I0312 18:06:39.864810 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:39 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:39 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:39 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:39 crc kubenswrapper[4926]: I0312 18:06:39.865154 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:40 crc kubenswrapper[4926]: I0312 18:06:40.220909 4926 ???:1] "http: TLS handshake error from 192.168.126.11:50064: no serving certificate available for the kubelet" Mar 12 18:06:40 crc kubenswrapper[4926]: I0312 18:06:40.594709 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rxml7" Mar 12 18:06:40 crc kubenswrapper[4926]: I0312 18:06:40.871320 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:40 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:40 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:40 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:40 crc kubenswrapper[4926]: I0312 18:06:40.871383 4926 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:41 crc kubenswrapper[4926]: I0312 18:06:41.862295 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:41 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:41 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:41 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:41 crc kubenswrapper[4926]: I0312 18:06:41.862377 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:42 crc kubenswrapper[4926]: I0312 18:06:42.862855 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:42 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:42 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:42 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:42 crc kubenswrapper[4926]: I0312 18:06:42.862915 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:43 crc kubenswrapper[4926]: I0312 18:06:43.862781 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:43 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:43 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:43 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:43 crc kubenswrapper[4926]: I0312 18:06:43.863044 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:43 crc kubenswrapper[4926]: I0312 18:06:43.999648 4926 patch_prober.go:28] interesting pod/console-f9d7485db-vb9qx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 12 18:06:43 crc kubenswrapper[4926]: I0312 18:06:43.999842 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-vb9qx" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 12 18:06:44 crc kubenswrapper[4926]: I0312 18:06:44.801088 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:44 crc kubenswrapper[4926]: I0312 18:06:44.803065 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 18:06:44 crc kubenswrapper[4926]: I0312 18:06:44.820277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/211eeae6-9b41-484b-bd13-99c1c28cdf96-metrics-certs\") pod \"network-metrics-daemon-n7pd7\" (UID: \"211eeae6-9b41-484b-bd13-99c1c28cdf96\") " pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:44 crc kubenswrapper[4926]: I0312 18:06:44.862497 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:44 crc kubenswrapper[4926]: [-]has-synced failed: reason withheld Mar 12 18:06:44 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:44 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:44 crc kubenswrapper[4926]: I0312 18:06:44.862547 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.015997 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.025158 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n7pd7" Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.671719 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.671780 4926 patch_prober.go:28] interesting pod/downloads-7954f5f757-st67x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.671831 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.671779 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-st67x" podUID="06394469-defd-4710-90a2-b6c395c00d4f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.863018 4926 patch_prober.go:28] interesting pod/router-default-5444994796-wndvq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 18:06:45 crc kubenswrapper[4926]: [+]has-synced ok Mar 12 18:06:45 crc kubenswrapper[4926]: [+]process-running ok Mar 12 18:06:45 crc kubenswrapper[4926]: healthz check failed Mar 12 18:06:45 crc kubenswrapper[4926]: I0312 18:06:45.863101 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wndvq" podUID="e379fe1d-7780-4f17-8df8-f74f3dddbc23" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 18:06:46 crc kubenswrapper[4926]: I0312 18:06:46.863456 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:46 crc kubenswrapper[4926]: I0312 18:06:46.866604 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wndvq" Mar 12 18:06:47 crc kubenswrapper[4926]: I0312 18:06:47.135597 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"] Mar 12 18:06:47 crc kubenswrapper[4926]: I0312 18:06:47.136230 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" containerID="cri-o://5c2190b7dcff0a0981911cc6e9cc8e221d3f909aececbc7028daeebdaaddefca" gracePeriod=30 Mar 12 18:06:47 crc kubenswrapper[4926]: I0312 18:06:47.143361 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:06:47 crc kubenswrapper[4926]: I0312 18:06:47.143589 4926 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" podUID="7e957b87-36f8-433f-9547-70bf22354018" containerName="route-controller-manager" containerID="cri-o://404fc2dbd169e6a506a47a16f7ea299950a21ae0b3144c3fe7d945d26a749973" gracePeriod=30 Mar 12 18:06:47 crc kubenswrapper[4926]: I0312 18:06:47.462471 4926 ???:1] "http: TLS handshake error from 192.168.126.11:44194: no serving certificate available for the kubelet" Mar 12 18:06:48 crc kubenswrapper[4926]: I0312 18:06:48.465197 4926 generic.go:334] "Generic (PLEG): container finished" podID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerID="5c2190b7dcff0a0981911cc6e9cc8e221d3f909aececbc7028daeebdaaddefca" exitCode=0 Mar 12 18:06:48 crc kubenswrapper[4926]: I0312 18:06:48.465295 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" event={"ID":"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a","Type":"ContainerDied","Data":"5c2190b7dcff0a0981911cc6e9cc8e221d3f909aececbc7028daeebdaaddefca"} Mar 12 18:06:48 crc kubenswrapper[4926]: I0312 18:06:48.467913 4926 generic.go:334] "Generic (PLEG): container finished" podID="7e957b87-36f8-433f-9547-70bf22354018" containerID="404fc2dbd169e6a506a47a16f7ea299950a21ae0b3144c3fe7d945d26a749973" exitCode=0 Mar 12 18:06:48 crc kubenswrapper[4926]: I0312 18:06:48.467987 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" event={"ID":"7e957b87-36f8-433f-9547-70bf22354018","Type":"ContainerDied","Data":"404fc2dbd169e6a506a47a16f7ea299950a21ae0b3144c3fe7d945d26a749973"} Mar 12 18:06:51 crc kubenswrapper[4926]: I0312 18:06:51.033479 4926 patch_prober.go:28] interesting pod/controller-manager-747cdc9dc9-hpk65 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 12 18:06:51 crc kubenswrapper[4926]: I0312 18:06:51.033599 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 12 18:06:53 crc kubenswrapper[4926]: I0312 18:06:53.067125 4926 patch_prober.go:28] interesting pod/route-controller-manager-758549c5dd-zclbs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 12 18:06:53 crc kubenswrapper[4926]: I0312 18:06:53.067429 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" podUID="7e957b87-36f8-433f-9547-70bf22354018" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 12 18:06:53 crc kubenswrapper[4926]: I0312 18:06:53.208779 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" Mar 12 18:06:54 crc kubenswrapper[4926]: I0312 18:06:54.003893 4926 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:54 crc kubenswrapper[4926]: I0312 18:06:54.007916 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:06:55 crc kubenswrapper[4926]: I0312 18:06:55.684963 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-st67x" Mar 12 18:06:56 crc kubenswrapper[4926]: I0312 18:06:56.818132 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:06:56 crc kubenswrapper[4926]: I0312 18:06:56.818220 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.530536 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.530975 4926 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 18:06:57 crc kubenswrapper[4926]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 12 18:06:57 crc kubenswrapper[4926]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tl7pp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29555646-wqpkb_openshift-infra(d68160cf-4e6c-4294-bfdc-4acb74637ecb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 12 18:06:57 crc kubenswrapper[4926]: > logger="UnhandledError" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.532121 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.934379 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.945691 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.964686 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.964945 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e957b87-36f8-433f-9547-70bf22354018" containerName="route-controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.964960 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e957b87-36f8-433f-9547-70bf22354018" containerName="route-controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.964981 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00c7182-7f78-4452-a377-670193a7366b" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.964990 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00c7182-7f78-4452-a377-670193a7366b" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.965005 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0156cb9a-b076-4cc2-9a14-68e06a5bbed4" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965013 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="0156cb9a-b076-4cc2-9a14-68e06a5bbed4" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: E0312 18:06:57.965029 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965037 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965151 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="0156cb9a-b076-4cc2-9a14-68e06a5bbed4" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965171 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00c7182-7f78-4452-a377-670193a7366b" containerName="pruner" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965186 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e957b87-36f8-433f-9547-70bf22354018" containerName="route-controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965195 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" containerName="controller-manager" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.965615 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:57 crc kubenswrapper[4926]: I0312 18:06:57.980030 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.013223 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb6dx\" (UniqueName: \"kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx\") pod \"7e957b87-36f8-433f-9547-70bf22354018\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.014946 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca\") pod \"7e957b87-36f8-433f-9547-70bf22354018\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.015116 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hbtx\" (UniqueName: \"kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx\") pod \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.015161 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config\") pod \"7e957b87-36f8-433f-9547-70bf22354018\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.015957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e957b87-36f8-433f-9547-70bf22354018" (UID: "7e957b87-36f8-433f-9547-70bf22354018"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.020424 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config" (OuterVolumeSpecName: "config") pod "7e957b87-36f8-433f-9547-70bf22354018" (UID: "7e957b87-36f8-433f-9547-70bf22354018"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.021156 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx" (OuterVolumeSpecName: "kube-api-access-5hbtx") pod "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" (UID: "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a"). InnerVolumeSpecName "kube-api-access-5hbtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.023504 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx" (OuterVolumeSpecName: "kube-api-access-xb6dx") pod "7e957b87-36f8-433f-9547-70bf22354018" (UID: "7e957b87-36f8-433f-9547-70bf22354018"). InnerVolumeSpecName "kube-api-access-xb6dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.023698 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config\") pod \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.023784 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert\") pod \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.024998 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config" (OuterVolumeSpecName: "config") pod "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" (UID: "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.025086 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca\") pod \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.025113 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles\") pod \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\" (UID: \"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.025715 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert\") pod \"7e957b87-36f8-433f-9547-70bf22354018\" (UID: \"7e957b87-36f8-433f-9547-70bf22354018\") " Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.025765 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" (UID: "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026206 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxl78\" (UniqueName: \"kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026430 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca" (OuterVolumeSpecName: "client-ca") pod "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" (UID: "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026567 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026641 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026692 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026808 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb6dx\" (UniqueName: \"kubernetes.io/projected/7e957b87-36f8-433f-9547-70bf22354018-kube-api-access-xb6dx\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026825 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026838 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hbtx\" (UniqueName: \"kubernetes.io/projected/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-kube-api-access-5hbtx\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026849 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e957b87-36f8-433f-9547-70bf22354018-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026860 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026871 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.026881 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.028254 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e957b87-36f8-433f-9547-70bf22354018" (UID: 
"7e957b87-36f8-433f-9547-70bf22354018"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.028380 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" (UID: "8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128276 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxl78\" (UniqueName: \"kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128387 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128430 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128473 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128522 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e957b87-36f8-433f-9547-70bf22354018-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.128536 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.129453 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.129716 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: 
\"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.133095 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.144609 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxl78\" (UniqueName: \"kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78\") pod \"route-controller-manager-b6dbb7676-l48qx\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.298967 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.530313 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" event={"ID":"8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a","Type":"ContainerDied","Data":"9b97c5c1e3245e013da7b8a53533ab149e2f2a6606fb877d381e7f0febac2c97"} Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.530336 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-747cdc9dc9-hpk65" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.530368 4926 scope.go:117] "RemoveContainer" containerID="5c2190b7dcff0a0981911cc6e9cc8e221d3f909aececbc7028daeebdaaddefca" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.532877 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.533157 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs" event={"ID":"7e957b87-36f8-433f-9547-70bf22354018","Type":"ContainerDied","Data":"9f487eeed2150c2eeef5da93b9c67da82e7c33df3d94484e998aa4c72eec6d5a"} Mar 12 18:06:58 crc kubenswrapper[4926]: E0312 18:06:58.534068 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.551131 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"] Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.553570 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-747cdc9dc9-hpk65"] Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.570178 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:06:58 crc kubenswrapper[4926]: I0312 18:06:58.572575 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-758549c5dd-zclbs"] Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.496204 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e957b87-36f8-433f-9547-70bf22354018" path="/var/lib/kubelet/pods/7e957b87-36f8-433f-9547-70bf22354018/volumes" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.497057 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a" path="/var/lib/kubelet/pods/8fe6b3ba-075b-4f7f-bc18-f6e4c1e4713a/volumes" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.767601 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.768761 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.773342 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.774160 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.775317 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.775960 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.776025 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.776034 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.777167 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.787064 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.870184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.870227 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.870260 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.870280 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.870300 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49sdm\" (UniqueName: 
\"kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.971568 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.971845 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.971869 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.971890 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.971911 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49sdm\" (UniqueName: \"kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.972527 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.973325 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.974594 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " 
pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.976786 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:00 crc kubenswrapper[4926]: I0312 18:07:00.991750 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49sdm\" (UniqueName: \"kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm\") pod \"controller-manager-7c9659bb69-k2bhk\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:01 crc kubenswrapper[4926]: I0312 18:07:01.106831 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.145034 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.145494 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnjm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t4sbh_openshift-marketplace(150781c8-5ae3-42a6-b351-2388dfe84167): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.146708 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-t4sbh" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.147851 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.147989 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wf7tf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-s4zf5_openshift-marketplace(b2a609cd-c298-4356-9ddf-a7f125b52938): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:02 crc kubenswrapper[4926]: E0312 18:07:02.149555 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-s4zf5" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" Mar 12 18:07:03 crc kubenswrapper[4926]: E0312 18:07:03.683219 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-s4zf5" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" Mar 12 18:07:03 crc kubenswrapper[4926]: E0312 18:07:03.683225 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/community-operators-t4sbh" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" Mar 12 18:07:03 crc kubenswrapper[4926]: E0312 18:07:03.738297 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 12 18:07:03 crc kubenswrapper[4926]: E0312 18:07:03.738536 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pkgrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-v5qw8_openshift-marketplace(8630848f-c268-4f4a-9fd0-8f33765c20b4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:03 crc kubenswrapper[4926]: E0312 18:07:03.739746 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v5qw8" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" Mar 12 18:07:04 crc kubenswrapper[4926]: I0312 18:07:04.792112 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:07:05 crc kubenswrapper[4926]: I0312 18:07:05.885379 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jfb58" Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.828386 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.829314 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.831301 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.831810 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.834525 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.945113 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:06 crc kubenswrapper[4926]: I0312 18:07:06.945167 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.046539 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.046613 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.046722 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.068507 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.133745 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.150266 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.229618 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:07:07 crc kubenswrapper[4926]: E0312 18:07:07.784665 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v5qw8" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" Mar 12 18:07:07 crc kubenswrapper[4926]: E0312 18:07:07.881650 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 12 18:07:07 crc kubenswrapper[4926]: E0312 18:07:07.881806 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v4msz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hmn6x_openshift-marketplace(1f425571-9ce5-4fdc-9631-7683efa292aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:07 crc kubenswrapper[4926]: E0312 18:07:07.883022 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hmn6x" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" Mar 12 18:07:07 crc kubenswrapper[4926]: I0312 18:07:07.978606 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n7pd7"] Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.577255 4926 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hmn6x" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" Mar 12 18:07:09 crc kubenswrapper[4926]: I0312 18:07:09.587286 4926 scope.go:117] "RemoveContainer" containerID="404fc2dbd169e6a506a47a16f7ea299950a21ae0b3144c3fe7d945d26a749973" Mar 12 18:07:09 crc kubenswrapper[4926]: W0312 18:07:09.608994 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211eeae6_9b41_484b_bd13_99c1c28cdf96.slice/crio-fa523477acd29a14cfd122e3283db6ab6e928eb05746f29ea2b64f89366a97aa WatchSource:0}: Error finding container fa523477acd29a14cfd122e3283db6ab6e928eb05746f29ea2b64f89366a97aa: Status 404 returned error can't find the container with id fa523477acd29a14cfd122e3283db6ab6e928eb05746f29ea2b64f89366a97aa Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.668480 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.668610 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwlr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wz6qt_openshift-marketplace(702daa2d-851e-4c3d-be86-4f337b4462f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.669925 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/redhat-marketplace-wz6qt" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.670597 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.670670 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wxfn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-565fl_openshift-marketplace(b5fe4032-6a1e-4c27-9471-fa53e044826e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.671729 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-565fl" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.699590 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.699925 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57ncv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zx9h7_openshift-marketplace(637236a6-6287-401d-a2cd-78713aa03176): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.701094 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zx9h7" podUID="637236a6-6287-401d-a2cd-78713aa03176" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.718869 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.719031 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fsdsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zpnq8_openshift-marketplace(743a7318-33d0-4a59-93bd-7c6899554e5e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 18:07:09 crc kubenswrapper[4926]: E0312 18:07:09.720375 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zpnq8" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" Mar 12 18:07:09 crc kubenswrapper[4926]: I0312 18:07:09.858572 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.132680 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.138769 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.594623 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" event={"ID":"211eeae6-9b41-484b-bd13-99c1c28cdf96","Type":"ContainerStarted","Data":"c7928a2bceb99f4b4564518463fb197c452d1977bbe5be608c6738681ef865ed"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.594935 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" event={"ID":"211eeae6-9b41-484b-bd13-99c1c28cdf96","Type":"ContainerStarted","Data":"b09c4ecfbce0d77ab595c1787ce759be04f8ca207eafee9fec5fb1a2e4086727"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.594946 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n7pd7" event={"ID":"211eeae6-9b41-484b-bd13-99c1c28cdf96","Type":"ContainerStarted","Data":"fa523477acd29a14cfd122e3283db6ab6e928eb05746f29ea2b64f89366a97aa"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 
18:07:10.597181 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" event={"ID":"23f96e9f-9853-4f3b-8c84-3ba79491d133","Type":"ContainerStarted","Data":"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.597205 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" event={"ID":"23f96e9f-9853-4f3b-8c84-3ba79491d133","Type":"ContainerStarted","Data":"b7ab31a5c8fe6aa803d3d10a661aa331fd52cf721b1a3dc48f874195bb4a0809"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.597282 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" podUID="23f96e9f-9853-4f3b-8c84-3ba79491d133" containerName="controller-manager" containerID="cri-o://232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296" gracePeriod=30 Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.597678 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.601212 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" event={"ID":"91f58aee-72fa-449f-8ec9-c80f89e5bc39","Type":"ContainerStarted","Data":"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.601239 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" event={"ID":"91f58aee-72fa-449f-8ec9-c80f89e5bc39","Type":"ContainerStarted","Data":"3adfd4583ba63f15a8883e043574aecc69b295fca7e18080bc6085933f638518"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.601322 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerName="route-controller-manager" containerID="cri-o://a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66" gracePeriod=30 Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.601586 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.602484 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.609989 4926 patch_prober.go:28] interesting pod/route-controller-manager-b6dbb7676-l48qx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:38340->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.610035 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 
10.217.0.2:38340->10.217.0.57:8443: read: connection reset by peer" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.610303 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51ec2202-9e22-4586-af8c-020d00160c26","Type":"ContainerStarted","Data":"bd5240b017915f06ca1867df3058775494e0b616b22b2960334e2eca5d692bcd"} Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.610332 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51ec2202-9e22-4586-af8c-020d00160c26","Type":"ContainerStarted","Data":"7460ec72f709ffd09a704af8c7bdb191459d3a1e2566af183b451949d0331dba"} Mar 12 18:07:10 crc kubenswrapper[4926]: E0312 18:07:10.613745 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zx9h7" podUID="637236a6-6287-401d-a2cd-78713aa03176" Mar 12 18:07:10 crc kubenswrapper[4926]: E0312 18:07:10.615399 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zpnq8" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" Mar 12 18:07:10 crc kubenswrapper[4926]: E0312 18:07:10.615416 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-565fl" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" Mar 12 18:07:10 crc kubenswrapper[4926]: E0312 18:07:10.615485 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wz6qt" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.627865 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n7pd7" podStartSLOduration=199.627845898 podStartE2EDuration="3m19.627845898s" podCreationTimestamp="2026-03-12 18:03:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:10.625105612 +0000 UTC m=+270.993731955" watchObservedRunningTime="2026-03-12 18:07:10.627845898 +0000 UTC m=+270.996472221" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.699371 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" podStartSLOduration=23.699354864 podStartE2EDuration="23.699354864s" podCreationTimestamp="2026-03-12 18:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:10.696142412 +0000 UTC m=+271.064768745" watchObservedRunningTime="2026-03-12 18:07:10.699354864 +0000 UTC m=+271.067981187" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.732123 4926 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=4.732090766 podStartE2EDuration="4.732090766s" podCreationTimestamp="2026-03-12 18:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:10.729786044 +0000 UTC m=+271.098412387" watchObservedRunningTime="2026-03-12 18:07:10.732090766 +0000 UTC m=+271.100717099" Mar 12 18:07:10 crc kubenswrapper[4926]: I0312 18:07:10.778231 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" podStartSLOduration=23.778212172 podStartE2EDuration="23.778212172s" podCreationTimestamp="2026-03-12 18:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:10.776762926 +0000 UTC m=+271.145389279" watchObservedRunningTime="2026-03-12 18:07:10.778212172 +0000 UTC m=+271.146838505" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.072609 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.085271 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.103153 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:11 crc kubenswrapper[4926]: E0312 18:07:11.103409 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerName="route-controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.103424 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerName="route-controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: E0312 18:07:11.105606 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23f96e9f-9853-4f3b-8c84-3ba79491d133" containerName="controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.105663 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f96e9f-9853-4f3b-8c84-3ba79491d133" containerName="controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.105933 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerName="route-controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.105951 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="23f96e9f-9853-4f3b-8c84-3ba79491d133" containerName="controller-manager" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.106457 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.113010 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211752 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca\") pod \"23f96e9f-9853-4f3b-8c84-3ba79491d133\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211819 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles\") pod \"23f96e9f-9853-4f3b-8c84-3ba79491d133\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211864 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert\") pod \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211902 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config\") pod \"23f96e9f-9853-4f3b-8c84-3ba79491d133\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211931 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxl78\" (UniqueName: \"kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78\") pod \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.211982 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49sdm\" (UniqueName: \"kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm\") pod \"23f96e9f-9853-4f3b-8c84-3ba79491d133\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212031 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config\") pod \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212066 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca\") pod \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\" (UID: \"91f58aee-72fa-449f-8ec9-c80f89e5bc39\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212098 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert\") pod \"23f96e9f-9853-4f3b-8c84-3ba79491d133\" (UID: \"23f96e9f-9853-4f3b-8c84-3ba79491d133\") " Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212298 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzg9f\" (UniqueName: \"kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212347 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212385 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212409 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212524 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca" (OuterVolumeSpecName: "client-ca") pod "23f96e9f-9853-4f3b-8c84-3ba79491d133" (UID: "23f96e9f-9853-4f3b-8c84-3ba79491d133"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.212582 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "23f96e9f-9853-4f3b-8c84-3ba79491d133" (UID: "23f96e9f-9853-4f3b-8c84-3ba79491d133"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213338 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config" (OuterVolumeSpecName: "config") pod "23f96e9f-9853-4f3b-8c84-3ba79491d133" (UID: "23f96e9f-9853-4f3b-8c84-3ba79491d133"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213579 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config" (OuterVolumeSpecName: "config") pod "91f58aee-72fa-449f-8ec9-c80f89e5bc39" (UID: "91f58aee-72fa-449f-8ec9-c80f89e5bc39"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213746 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213877 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213894 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213908 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.213921 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f96e9f-9853-4f3b-8c84-3ba79491d133-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.214000 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca" (OuterVolumeSpecName: "client-ca") pod "91f58aee-72fa-449f-8ec9-c80f89e5bc39" (UID: "91f58aee-72fa-449f-8ec9-c80f89e5bc39"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.217763 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23f96e9f-9853-4f3b-8c84-3ba79491d133" (UID: "23f96e9f-9853-4f3b-8c84-3ba79491d133"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.217934 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "91f58aee-72fa-449f-8ec9-c80f89e5bc39" (UID: "91f58aee-72fa-449f-8ec9-c80f89e5bc39"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.218227 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm" (OuterVolumeSpecName: "kube-api-access-49sdm") pod "23f96e9f-9853-4f3b-8c84-3ba79491d133" (UID: "23f96e9f-9853-4f3b-8c84-3ba79491d133"). InnerVolumeSpecName "kube-api-access-49sdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.218306 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78" (OuterVolumeSpecName: "kube-api-access-jxl78") pod "91f58aee-72fa-449f-8ec9-c80f89e5bc39" (UID: "91f58aee-72fa-449f-8ec9-c80f89e5bc39"). InnerVolumeSpecName "kube-api-access-jxl78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.314909 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.314981 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315027 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzg9f\" (UniqueName: \"kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315057 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315090 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315131 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/91f58aee-72fa-449f-8ec9-c80f89e5bc39-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315143 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23f96e9f-9853-4f3b-8c84-3ba79491d133-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315152 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91f58aee-72fa-449f-8ec9-c80f89e5bc39-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315162 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxl78\" (UniqueName: 
\"kubernetes.io/projected/91f58aee-72fa-449f-8ec9-c80f89e5bc39-kube-api-access-jxl78\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.315173 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49sdm\" (UniqueName: \"kubernetes.io/projected/23f96e9f-9853-4f3b-8c84-3ba79491d133-kube-api-access-49sdm\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.316263 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.317501 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.319419 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.319491 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.330038 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzg9f\" (UniqueName: \"kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f\") pod \"controller-manager-57d6b6c74b-ndw9d\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.480884 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.620804 4926 generic.go:334] "Generic (PLEG): container finished" podID="51ec2202-9e22-4586-af8c-020d00160c26" containerID="bd5240b017915f06ca1867df3058775494e0b616b22b2960334e2eca5d692bcd" exitCode=0 Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.620935 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51ec2202-9e22-4586-af8c-020d00160c26","Type":"ContainerDied","Data":"bd5240b017915f06ca1867df3058775494e0b616b22b2960334e2eca5d692bcd"} Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.624050 4926 generic.go:334] "Generic (PLEG): container finished" podID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" containerID="a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66" exitCode=0 Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.624188 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.624242 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" event={"ID":"91f58aee-72fa-449f-8ec9-c80f89e5bc39","Type":"ContainerDied","Data":"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66"} Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.624283 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx" event={"ID":"91f58aee-72fa-449f-8ec9-c80f89e5bc39","Type":"ContainerDied","Data":"3adfd4583ba63f15a8883e043574aecc69b295fca7e18080bc6085933f638518"} Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.624303 4926 scope.go:117] "RemoveContainer" containerID="a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.626828 4926 generic.go:334] "Generic (PLEG): container finished" podID="23f96e9f-9853-4f3b-8c84-3ba79491d133" containerID="232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296" exitCode=0 Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.627422 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.632574 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" event={"ID":"23f96e9f-9853-4f3b-8c84-3ba79491d133","Type":"ContainerDied","Data":"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296"} Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.632626 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c9659bb69-k2bhk" event={"ID":"23f96e9f-9853-4f3b-8c84-3ba79491d133","Type":"ContainerDied","Data":"b7ab31a5c8fe6aa803d3d10a661aa331fd52cf721b1a3dc48f874195bb4a0809"} Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.654478 4926 scope.go:117] "RemoveContainer" containerID="a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66" Mar 12 18:07:11 crc kubenswrapper[4926]: E0312 18:07:11.655389 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66\": container with ID starting with a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66 not found: ID does not exist" containerID="a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.655485 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66"} err="failed to get container status \"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66\": rpc error: code = NotFound desc = could not find container \"a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66\": container with ID starting with a14fd899c389a50963ad79b00b1c3a6ea8e5844f2a8769831d6f4a4c8d367d66 not found: ID does not exist" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.655516 4926 scope.go:117] "RemoveContainer" containerID="232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.663041 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.668478 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c9659bb69-k2bhk"] Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.671086 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.673861 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b6dbb7676-l48qx"] Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.677182 4926 scope.go:117] "RemoveContainer" containerID="232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296" Mar 12 18:07:11 crc kubenswrapper[4926]: E0312 18:07:11.677617 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296\": container with ID starting with 232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296 not found: ID does not exist" 
containerID="232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.677653 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296"} err="failed to get container status \"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296\": rpc error: code = NotFound desc = could not find container \"232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296\": container with ID starting with 232bc05bcf31de44ab262bfed931d35f35e41de12c12d164f104762a81f8c296 not found: ID does not exist" Mar 12 18:07:11 crc kubenswrapper[4926]: I0312 18:07:11.859111 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:11 crc kubenswrapper[4926]: W0312 18:07:11.865078 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9a46d07_aea8_4496_b088_1d30771449a5.slice/crio-970f69b22a7b41275946a8af481797c27d777726022fcf08385e195d2cc5696f WatchSource:0}: Error finding container 970f69b22a7b41275946a8af481797c27d777726022fcf08385e195d2cc5696f: Status 404 returned error can't find the container with id 970f69b22a7b41275946a8af481797c27d777726022fcf08385e195d2cc5696f Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.442024 4926 csr.go:261] certificate signing request csr-kmkcd is approved, waiting to be issued Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.447909 4926 csr.go:257] certificate signing request csr-kmkcd is issued Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.503284 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f96e9f-9853-4f3b-8c84-3ba79491d133" path="/var/lib/kubelet/pods/23f96e9f-9853-4f3b-8c84-3ba79491d133/volumes" Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.503780 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91f58aee-72fa-449f-8ec9-c80f89e5bc39" path="/var/lib/kubelet/pods/91f58aee-72fa-449f-8ec9-c80f89e5bc39/volumes" Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.634581 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" event={"ID":"d9a46d07-aea8-4496-b088-1d30771449a5","Type":"ContainerStarted","Data":"2e38aace17ed1402804edbaeeb451a49615dca3e7706295cc9da9a75fd5336e5"} Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.634852 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.634866 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" event={"ID":"d9a46d07-aea8-4496-b088-1d30771449a5","Type":"ContainerStarted","Data":"970f69b22a7b41275946a8af481797c27d777726022fcf08385e195d2cc5696f"} Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.636921 4926 generic.go:334] "Generic (PLEG): container finished" podID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" containerID="52e8bc12e7903e1d736ebf550d285fa84f2e22f28d746c1ea7750000e16f3903" exitCode=0 Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.637239 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" 
event={"ID":"d68160cf-4e6c-4294-bfdc-4acb74637ecb","Type":"ContainerDied","Data":"52e8bc12e7903e1d736ebf550d285fa84f2e22f28d746c1ea7750000e16f3903"} Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.641685 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.659713 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" podStartSLOduration=5.6596979229999995 podStartE2EDuration="5.659697923s" podCreationTimestamp="2026-03-12 18:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:12.655038086 +0000 UTC m=+273.023664419" watchObservedRunningTime="2026-03-12 18:07:12.659697923 +0000 UTC m=+273.028324256" Mar 12 18:07:12 crc kubenswrapper[4926]: I0312 18:07:12.920933 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.034747 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access\") pod \"51ec2202-9e22-4586-af8c-020d00160c26\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.034847 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir\") pod \"51ec2202-9e22-4586-af8c-020d00160c26\" (UID: \"51ec2202-9e22-4586-af8c-020d00160c26\") " Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.034951 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "51ec2202-9e22-4586-af8c-020d00160c26" (UID: "51ec2202-9e22-4586-af8c-020d00160c26"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.035129 4926 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ec2202-9e22-4586-af8c-020d00160c26-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.040047 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "51ec2202-9e22-4586-af8c-020d00160c26" (UID: "51ec2202-9e22-4586-af8c-020d00160c26"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.136892 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ec2202-9e22-4586-af8c-020d00160c26-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.448944 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 13:33:33.890548427 +0000 UTC Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.448984 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7171h26m20.441566951s for next certificate rotation Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.644357 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51ec2202-9e22-4586-af8c-020d00160c26","Type":"ContainerDied","Data":"7460ec72f709ffd09a704af8c7bdb191459d3a1e2566af183b451949d0331dba"} Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.644504 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.644529 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7460ec72f709ffd09a704af8c7bdb191459d3a1e2566af183b451949d0331dba" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.780838 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:13 crc kubenswrapper[4926]: E0312 18:07:13.781164 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ec2202-9e22-4586-af8c-020d00160c26" containerName="pruner" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.781176 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ec2202-9e22-4586-af8c-020d00160c26" containerName="pruner" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.781275 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ec2202-9e22-4586-af8c-020d00160c26" containerName="pruner" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.781598 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.781745 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.833637 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.833722 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.833859 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.833990 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.834006 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.834096 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.854365 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfl44\" (UniqueName: \"kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.854463 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.854497 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.854841 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.945240 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.956208 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.956274 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfl44\" (UniqueName: \"kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.956308 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.956328 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.958194 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.958612 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.960517 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:13 crc kubenswrapper[4926]: I0312 18:07:13.972142 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfl44\" (UniqueName: \"kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44\") pod \"route-controller-manager-56fbddc998-ghqvq\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:14 crc 
kubenswrapper[4926]: I0312 18:07:14.057183 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl7pp\" (UniqueName: \"kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp\") pod \"d68160cf-4e6c-4294-bfdc-4acb74637ecb\" (UID: \"d68160cf-4e6c-4294-bfdc-4acb74637ecb\") " Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.060220 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp" (OuterVolumeSpecName: "kube-api-access-tl7pp") pod "d68160cf-4e6c-4294-bfdc-4acb74637ecb" (UID: "d68160cf-4e6c-4294-bfdc-4acb74637ecb"). InnerVolumeSpecName "kube-api-access-tl7pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.153072 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.158370 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl7pp\" (UniqueName: \"kubernetes.io/projected/d68160cf-4e6c-4294-bfdc-4acb74637ecb-kube-api-access-tl7pp\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.449523 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-13 11:26:27.293792942 +0000 UTC Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.449559 4926 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5897h19m12.844235889s for next certificate rotation Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.534934 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.543713 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:14 crc kubenswrapper[4926]: W0312 18:07:14.557713 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ffc00d2_8ad7_4109_aab6_18c87d0bc51e.slice/crio-d9af210d4e051b95f3fb2613a9ee5e48096656a2e0c5a0f8ba2b6c022529ebcc WatchSource:0}: Error finding container d9af210d4e051b95f3fb2613a9ee5e48096656a2e0c5a0f8ba2b6c022529ebcc: Status 404 returned error can't find the container with id d9af210d4e051b95f3fb2613a9ee5e48096656a2e0c5a0f8ba2b6c022529ebcc Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.652260 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" event={"ID":"d68160cf-4e6c-4294-bfdc-4acb74637ecb","Type":"ContainerDied","Data":"71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042"} Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.652572 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f803a395609858e2670b2b53c7204676afa8a634426e199a4c62b07d977042" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.652298 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555646-wqpkb" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.653591 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" event={"ID":"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e","Type":"ContainerStarted","Data":"d9af210d4e051b95f3fb2613a9ee5e48096656a2e0c5a0f8ba2b6c022529ebcc"} Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.825519 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 18:07:14 crc kubenswrapper[4926]: E0312 18:07:14.825786 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" containerName="oc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.825806 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" containerName="oc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.825926 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" containerName="oc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.826341 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.828965 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.829162 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.832618 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.866398 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.866512 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.866537 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.967674 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.967739 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.967754 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.967772 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.967903 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:14 crc kubenswrapper[4926]: I0312 18:07:14.988744 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access\") pod \"installer-9-crc\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.151907 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.568404 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 18:07:15 crc kubenswrapper[4926]: W0312 18:07:15.572620 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod63c14bbd_6ba6_42c2_9e94_bf3a6f68500f.slice/crio-491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544 WatchSource:0}: Error finding container 491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544: Status 404 returned error can't find the container with id 491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544 Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.660985 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f","Type":"ContainerStarted","Data":"491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544"} Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.663147 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" event={"ID":"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e","Type":"ContainerStarted","Data":"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153"} Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.663711 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.669512 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:15 crc kubenswrapper[4926]: I0312 18:07:15.682677 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" podStartSLOduration=8.682632268 podStartE2EDuration="8.682632268s" podCreationTimestamp="2026-03-12 18:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:15.679149188 +0000 UTC m=+276.047775531" watchObservedRunningTime="2026-03-12 18:07:15.682632268 +0000 UTC m=+276.051258631" Mar 12 18:07:16 crc kubenswrapper[4926]: I0312 18:07:16.669635 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f","Type":"ContainerStarted","Data":"988893c573b728d62d468c803dd6aa2333147e0f68cf77f807942af0ae0d7912"} Mar 12 18:07:16 crc kubenswrapper[4926]: I0312 18:07:16.684009 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.683997301 podStartE2EDuration="2.683997301s" podCreationTimestamp="2026-03-12 18:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:16.683565098 +0000 UTC m=+277.052191441" watchObservedRunningTime="2026-03-12 18:07:16.683997301 +0000 UTC m=+277.052623634" Mar 12 18:07:19 crc kubenswrapper[4926]: I0312 18:07:19.684620 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" 
event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerStarted","Data":"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a"} Mar 12 18:07:19 crc kubenswrapper[4926]: I0312 18:07:19.686548 4926 generic.go:334] "Generic (PLEG): container finished" podID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerID="f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc" exitCode=0 Mar 12 18:07:19 crc kubenswrapper[4926]: I0312 18:07:19.686576 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerDied","Data":"f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc"} Mar 12 18:07:20 crc kubenswrapper[4926]: I0312 18:07:20.693826 4926 generic.go:334] "Generic (PLEG): container finished" podID="150781c8-5ae3-42a6-b351-2388dfe84167" containerID="e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a" exitCode=0 Mar 12 18:07:20 crc kubenswrapper[4926]: I0312 18:07:20.693932 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerDied","Data":"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a"} Mar 12 18:07:20 crc kubenswrapper[4926]: I0312 18:07:20.701848 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerStarted","Data":"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e"} Mar 12 18:07:20 crc kubenswrapper[4926]: I0312 18:07:20.733298 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s4zf5" podStartSLOduration=3.648730875 podStartE2EDuration="50.733283957s" podCreationTimestamp="2026-03-12 18:06:30 +0000 UTC" firstStartedPulling="2026-03-12 18:06:33.018194031 +0000 UTC m=+233.386820364" lastFinishedPulling="2026-03-12 18:07:20.102747103 +0000 UTC m=+280.471373446" observedRunningTime="2026-03-12 18:07:20.73020616 +0000 UTC m=+281.098832493" watchObservedRunningTime="2026-03-12 18:07:20.733283957 +0000 UTC m=+281.101910290" Mar 12 18:07:21 crc kubenswrapper[4926]: I0312 18:07:21.417596 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:07:21 crc kubenswrapper[4926]: I0312 18:07:21.417655 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:07:21 crc kubenswrapper[4926]: I0312 18:07:21.712806 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerStarted","Data":"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970"} Mar 12 18:07:21 crc kubenswrapper[4926]: I0312 18:07:21.733113 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t4sbh" podStartSLOduration=2.363138664 podStartE2EDuration="50.733096272s" podCreationTimestamp="2026-03-12 18:06:31 +0000 UTC" firstStartedPulling="2026-03-12 18:06:33.023748026 +0000 UTC m=+233.392374359" lastFinishedPulling="2026-03-12 18:07:21.393705604 +0000 UTC m=+281.762331967" observedRunningTime="2026-03-12 18:07:21.727594998 +0000 UTC m=+282.096221351" watchObservedRunningTime="2026-03-12 
18:07:21.733096272 +0000 UTC m=+282.101722615" Mar 12 18:07:22 crc kubenswrapper[4926]: I0312 18:07:22.561352 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-s4zf5" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="registry-server" probeResult="failure" output=< Mar 12 18:07:22 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:07:22 crc kubenswrapper[4926]: > Mar 12 18:07:23 crc kubenswrapper[4926]: I0312 18:07:23.727367 4926 generic.go:334] "Generic (PLEG): container finished" podID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerID="d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5" exitCode=0 Mar 12 18:07:23 crc kubenswrapper[4926]: I0312 18:07:23.727416 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerDied","Data":"d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5"} Mar 12 18:07:24 crc kubenswrapper[4926]: I0312 18:07:24.737238 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerStarted","Data":"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8"} Mar 12 18:07:24 crc kubenswrapper[4926]: I0312 18:07:24.766722 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-565fl" podStartSLOduration=2.399645696 podStartE2EDuration="53.766704542s" podCreationTimestamp="2026-03-12 18:06:31 +0000 UTC" firstStartedPulling="2026-03-12 18:06:33.007413821 +0000 UTC m=+233.376040154" lastFinishedPulling="2026-03-12 18:07:24.374472627 +0000 UTC m=+284.743099000" observedRunningTime="2026-03-12 18:07:24.765090542 +0000 UTC m=+285.133716895" watchObservedRunningTime="2026-03-12 18:07:24.766704542 +0000 UTC m=+285.135330875" Mar 12 18:07:25 crc kubenswrapper[4926]: I0312 18:07:25.750877 4926 generic.go:334] "Generic (PLEG): container finished" podID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerID="23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0" exitCode=0 Mar 12 18:07:25 crc kubenswrapper[4926]: I0312 18:07:25.751003 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerDied","Data":"23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.761378 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerStarted","Data":"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.768838 4926 generic.go:334] "Generic (PLEG): container finished" podID="637236a6-6287-401d-a2cd-78713aa03176" containerID="599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb" exitCode=0 Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.768881 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerDied","Data":"599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.773712 4926 
generic.go:334] "Generic (PLEG): container finished" podID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerID="ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199" exitCode=0 Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.773815 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerDied","Data":"ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.777202 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerStarted","Data":"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.779736 4926 generic.go:334] "Generic (PLEG): container finished" podID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerID="946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b" exitCode=0 Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.779778 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerDied","Data":"946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b"} Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.785635 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v5qw8" podStartSLOduration=2.683320053 podStartE2EDuration="53.78561732s" podCreationTimestamp="2026-03-12 18:06:33 +0000 UTC" firstStartedPulling="2026-03-12 18:06:35.158601298 +0000 UTC m=+235.527227631" lastFinishedPulling="2026-03-12 18:07:26.260898555 +0000 UTC m=+286.629524898" observedRunningTime="2026-03-12 18:07:26.782552733 +0000 UTC m=+287.151179086" watchObservedRunningTime="2026-03-12 18:07:26.78561732 +0000 UTC m=+287.154243663" Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.817463 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.817540 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.817596 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.818271 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:07:26 crc kubenswrapper[4926]: I0312 18:07:26.818338 4926 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff" gracePeriod=600 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.161753 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.162334 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" podUID="d9a46d07-aea8-4496-b088-1d30771449a5" containerName="controller-manager" containerID="cri-o://2e38aace17ed1402804edbaeeb451a49615dca3e7706295cc9da9a75fd5336e5" gracePeriod=30 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.186398 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.186671 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" podUID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" containerName="route-controller-manager" containerID="cri-o://d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153" gracePeriod=30 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.686147 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.787021 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerStarted","Data":"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.788811 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff" exitCode=0 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.788874 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.788923 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.791119 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerStarted","Data":"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.792307 4926 generic.go:334] "Generic (PLEG): container finished" podID="d9a46d07-aea8-4496-b088-1d30771449a5" containerID="2e38aace17ed1402804edbaeeb451a49615dca3e7706295cc9da9a75fd5336e5" exitCode=0 Mar 12 18:07:27 crc kubenswrapper[4926]: 
I0312 18:07:27.792385 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" event={"ID":"d9a46d07-aea8-4496-b088-1d30771449a5","Type":"ContainerDied","Data":"2e38aace17ed1402804edbaeeb451a49615dca3e7706295cc9da9a75fd5336e5"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.795801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerStarted","Data":"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.796642 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.799264 4926 generic.go:334] "Generic (PLEG): container finished" podID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerID="9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429" exitCode=0 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.799298 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerDied","Data":"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.801533 4926 generic.go:334] "Generic (PLEG): container finished" podID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" containerID="d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153" exitCode=0 Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.801566 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" event={"ID":"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e","Type":"ContainerDied","Data":"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.801592 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" event={"ID":"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e","Type":"ContainerDied","Data":"d9af210d4e051b95f3fb2613a9ee5e48096656a2e0c5a0f8ba2b6c022529ebcc"} Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.801608 4926 scope.go:117] "RemoveContainer" containerID="d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.801741 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.832907 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wz6qt" podStartSLOduration=2.7855645239999998 podStartE2EDuration="54.832889551s" podCreationTimestamp="2026-03-12 18:06:33 +0000 UTC" firstStartedPulling="2026-03-12 18:06:35.197604664 +0000 UTC m=+235.566230997" lastFinishedPulling="2026-03-12 18:07:27.244929691 +0000 UTC m=+287.613556024" observedRunningTime="2026-03-12 18:07:27.814616715 +0000 UTC m=+288.183243048" watchObservedRunningTime="2026-03-12 18:07:27.832889551 +0000 UTC m=+288.201515884" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.833154 4926 scope.go:117] "RemoveContainer" containerID="d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153" Mar 12 18:07:27 crc kubenswrapper[4926]: E0312 18:07:27.833690 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153\": container with ID starting with d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153 not found: ID does not exist" containerID="d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.833791 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153"} err="failed to get container status \"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153\": rpc error: code = NotFound desc = could not find container \"d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153\": container with ID starting with d7bed4139ba4a473385e0f0c19ea5e22e1f71a1f309b05c1696dda7157f26153 not found: ID does not exist" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.839363 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert\") pod \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.839522 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config\") pod \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.839598 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca\") pod \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.839706 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfl44\" (UniqueName: \"kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44\") pod \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\" (UID: \"5ffc00d2-8ad7-4109-aab6-18c87d0bc51e\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.840750 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config" (OuterVolumeSpecName: "config") pod "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" (UID: "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.841538 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" (UID: "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.846243 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44" (OuterVolumeSpecName: "kube-api-access-cfl44") pod "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" (UID: "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e"). InnerVolumeSpecName "kube-api-access-cfl44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.849236 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" (UID: "5ffc00d2-8ad7-4109-aab6-18c87d0bc51e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.857493 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zpnq8" podStartSLOduration=2.437658851 podStartE2EDuration="56.857474667s" podCreationTimestamp="2026-03-12 18:06:31 +0000 UTC" firstStartedPulling="2026-03-12 18:06:33.04112893 +0000 UTC m=+233.409755263" lastFinishedPulling="2026-03-12 18:07:27.460944746 +0000 UTC m=+287.829571079" observedRunningTime="2026-03-12 18:07:27.847758751 +0000 UTC m=+288.216385084" watchObservedRunningTime="2026-03-12 18:07:27.857474667 +0000 UTC m=+288.226101000" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.903371 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zx9h7" podStartSLOduration=2.940827531 podStartE2EDuration="53.903355855s" podCreationTimestamp="2026-03-12 18:06:34 +0000 UTC" firstStartedPulling="2026-03-12 18:06:36.23210791 +0000 UTC m=+236.600734243" lastFinishedPulling="2026-03-12 18:07:27.194636234 +0000 UTC m=+287.563262567" observedRunningTime="2026-03-12 18:07:27.90163736 +0000 UTC m=+288.270263693" watchObservedRunningTime="2026-03-12 18:07:27.903355855 +0000 UTC m=+288.271982188" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941101 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert\") pod \"d9a46d07-aea8-4496-b088-1d30771449a5\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941170 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles\") pod \"d9a46d07-aea8-4496-b088-1d30771449a5\" (UID: 
\"d9a46d07-aea8-4496-b088-1d30771449a5\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941210 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzg9f\" (UniqueName: \"kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f\") pod \"d9a46d07-aea8-4496-b088-1d30771449a5\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941261 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config\") pod \"d9a46d07-aea8-4496-b088-1d30771449a5\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941293 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca\") pod \"d9a46d07-aea8-4496-b088-1d30771449a5\" (UID: \"d9a46d07-aea8-4496-b088-1d30771449a5\") " Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941722 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941758 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfl44\" (UniqueName: \"kubernetes.io/projected/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-kube-api-access-cfl44\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941772 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941782 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941944 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d9a46d07-aea8-4496-b088-1d30771449a5" (UID: "d9a46d07-aea8-4496-b088-1d30771449a5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.941997 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config" (OuterVolumeSpecName: "config") pod "d9a46d07-aea8-4496-b088-1d30771449a5" (UID: "d9a46d07-aea8-4496-b088-1d30771449a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.942275 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9a46d07-aea8-4496-b088-1d30771449a5" (UID: "d9a46d07-aea8-4496-b088-1d30771449a5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.944023 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9a46d07-aea8-4496-b088-1d30771449a5" (UID: "d9a46d07-aea8-4496-b088-1d30771449a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:27 crc kubenswrapper[4926]: I0312 18:07:27.944993 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f" (OuterVolumeSpecName: "kube-api-access-pzg9f") pod "d9a46d07-aea8-4496-b088-1d30771449a5" (UID: "d9a46d07-aea8-4496-b088-1d30771449a5"). InnerVolumeSpecName "kube-api-access-pzg9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.042453 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9a46d07-aea8-4496-b088-1d30771449a5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.042763 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.042778 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzg9f\" (UniqueName: \"kubernetes.io/projected/d9a46d07-aea8-4496-b088-1d30771449a5-kube-api-access-pzg9f\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.042790 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.042799 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9a46d07-aea8-4496-b088-1d30771449a5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.129680 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.135255 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fbddc998-ghqvq"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.502625 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" path="/var/lib/kubelet/pods/5ffc00d2-8ad7-4109-aab6-18c87d0bc51e/volumes" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.788146 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"] Mar 12 18:07:28 crc kubenswrapper[4926]: E0312 18:07:28.788739 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" containerName="route-controller-manager" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.788757 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" containerName="route-controller-manager" Mar 12 18:07:28 crc 
kubenswrapper[4926]: E0312 18:07:28.788780 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a46d07-aea8-4496-b088-1d30771449a5" containerName="controller-manager" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.788788 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a46d07-aea8-4496-b088-1d30771449a5" containerName="controller-manager" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.788908 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a46d07-aea8-4496-b088-1d30771449a5" containerName="controller-manager" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.788928 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffc00d2-8ad7-4109-aab6-18c87d0bc51e" containerName="route-controller-manager" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.789555 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.791847 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.792670 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.797114 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.797314 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.802827 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.802848 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.802963 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.803099 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.804756 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.810011 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.812877 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerStarted","Data":"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167"} Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.815782 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" 
event={"ID":"d9a46d07-aea8-4496-b088-1d30771449a5","Type":"ContainerDied","Data":"970f69b22a7b41275946a8af481797c27d777726022fcf08385e195d2cc5696f"} Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.815828 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.815834 4926 scope.go:117] "RemoveContainer" containerID="2e38aace17ed1402804edbaeeb451a49615dca3e7706295cc9da9a75fd5336e5" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.842493 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.847348 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57d6b6c74b-ndw9d"] Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.864041 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmn6x" podStartSLOduration=2.833563277 podStartE2EDuration="54.864026544s" podCreationTimestamp="2026-03-12 18:06:34 +0000 UTC" firstStartedPulling="2026-03-12 18:06:36.245100261 +0000 UTC m=+236.613726594" lastFinishedPulling="2026-03-12 18:07:28.275563528 +0000 UTC m=+288.644189861" observedRunningTime="2026-03-12 18:07:28.860213894 +0000 UTC m=+289.228840227" watchObservedRunningTime="2026-03-12 18:07:28.864026544 +0000 UTC m=+289.232652877" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.952926 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.952999 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8k8\" (UniqueName: \"kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953029 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953049 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953071 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953091 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgt8s\" (UniqueName: \"kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953115 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953149 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:28 crc kubenswrapper[4926]: I0312 18:07:28.953167 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054290 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054342 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054367 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054405 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8k8\" (UniqueName: \"kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8\") pod 
\"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054449 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054473 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054502 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054521 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgt8s\" (UniqueName: \"kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.054553 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.055687 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.056890 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.057581 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 
18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.058864 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.059285 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.077789 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8k8\" (UniqueName: \"kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.079255 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert\") pod \"route-controller-manager-6b5767577f-5znp2\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.080179 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgt8s\" (UniqueName: \"kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.080659 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert\") pod \"controller-manager-74548584bf-x85ch\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.106840 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.116131 4926 util.go:30] "No sandbox for pod can be found. 
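Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"

The run above shows the kubelet volume manager's three-step flow for the two new pods: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded for each named volume. The volume names inside the UniqueName strings ("config", "client-ca", "serving-cert", "proxy-ca-bundles", "kube-api-access-*") come straight from pod.Spec.Volumes. A client-go sketch that lists them for the controller-manager pod above (a sketch, not the kubelet's own code; assumes a reachable kubeconfig at the default location):

```go
// A sketch: list the pod's volumes and their sources with client-go.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	pod, err := cs.CoreV1().Pods("openshift-controller-manager").
		Get(context.Background(), "controller-manager-74548584bf-x85ch", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Each name printed here reappears in the reconciler_common.go lines above.
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.ConfigMap != nil:
			fmt.Printf("%s\t-> configmap/%s\n", v.Name, v.ConfigMap.Name)
		case v.Secret != nil:
			fmt.Printf("%s\t-> secret/%s\n", v.Name, v.Secret.SecretName)
		case v.Projected != nil:
			fmt.Printf("%s\t-> projected (service account token et al.)\n", v.Name)
		}
	}
}
```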
Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.319973 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"] Mar 12 18:07:29 crc kubenswrapper[4926]: W0312 18:07:29.324748 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5f25640_5527_4caa_a290_cbb3bbfc5e0b.slice/crio-9b5a165d0fdbcf22977eee75175ce438f8c21d2c260b57e0ebed1521f9e89fb5 WatchSource:0}: Error finding container 9b5a165d0fdbcf22977eee75175ce438f8c21d2c260b57e0ebed1521f9e89fb5: Status 404 returned error can't find the container with id 9b5a165d0fdbcf22977eee75175ce438f8c21d2c260b57e0ebed1521f9e89fb5 Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.659724 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"] Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.816135 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerName="oauth-openshift" containerID="cri-o://9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98" gracePeriod=15 Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.856659 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" event={"ID":"a5f25640-5527-4caa-a290-cbb3bbfc5e0b","Type":"ContainerStarted","Data":"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2"} Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.856927 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" event={"ID":"a5f25640-5527-4caa-a290-cbb3bbfc5e0b","Type":"ContainerStarted","Data":"9b5a165d0fdbcf22977eee75175ce438f8c21d2c260b57e0ebed1521f9e89fb5"} Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.857945 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.873806 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" event={"ID":"be3d67c9-0e01-48ab-8091-2e0bf103655d","Type":"ContainerStarted","Data":"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128"} Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.874038 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" event={"ID":"be3d67c9-0e01-48ab-8091-2e0bf103655d","Type":"ContainerStarted","Data":"581ff73fc13e04cb81a4a9c9a97934c14adf3bf26fd03837864c6f60ecb97184"} Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.874930 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.878169 4926 patch_prober.go:28] interesting pod/route-controller-manager-6b5767577f-5znp2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp
10.217.0.65:8443: connect: connection refused" start-of-body= Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.878219 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.905986 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.910540 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" podStartSLOduration=2.910526431 podStartE2EDuration="2.910526431s" podCreationTimestamp="2026-03-12 18:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:29.88672585 +0000 UTC m=+290.255352183" watchObservedRunningTime="2026-03-12 18:07:29.910526431 +0000 UTC m=+290.279152754" Mar 12 18:07:29 crc kubenswrapper[4926]: I0312 18:07:29.911758 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" podStartSLOduration=2.91175347 podStartE2EDuration="2.91175347s" podCreationTimestamp="2026-03-12 18:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:29.908780986 +0000 UTC m=+290.277407319" watchObservedRunningTime="2026-03-12 18:07:29.91175347 +0000 UTC m=+290.280379803" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.223481 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379468 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379566 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379573 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "audit-dir". 
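PluginName "kubernetes.io/host-path", VolumeGidValue ""

The readiness failure above is the kubelet's HTTPS probe hitting the route-controller-manager pod before its socket is listening, hence "connection refused" rather than a non-200 status; the next probe cycle reports ready. Roughly the same check in Go (the 1s timeout and the skipped certificate verification mirror typical kubelet probe behavior, but treat both as assumptions of this sketch):

```go
// A sketch of the failing readiness probe: HTTPS GET /healthz on the pod IP.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 1 * time.Second, // assumed probe timeoutSeconds
		Transport: &http.Transport{
			// Kubelet HTTPS probes do not verify the serving certificate.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://10.217.0.65:8443/healthz")
	if err != nil {
		// Before the server socket opens, this is exactly the log's
		// "dial tcp 10.217.0.65:8443: connect: connection refused".
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status)
}
```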
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379631 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379694 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379728 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379763 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379798 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379832 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.379870 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88hg4\" (UniqueName: \"kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.380897 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.380952 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc 
kubenswrapper[4926]: I0312 18:07:30.381005 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.381056 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.381099 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle\") pod \"eedd886d-5443-47e1-afbf-5aff90067f3b\" (UID: \"eedd886d-5443-47e1-afbf-5aff90067f3b\") " Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.381353 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.381507 4926 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.381519 4926 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.382154 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.382256 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.382404 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.405290 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.405726 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.406029 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.407957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.409106 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4" (OuterVolumeSpecName: "kube-api-access-88hg4") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "kube-api-access-88hg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.409202 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.409458 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.412718 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.414366 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "eedd886d-5443-47e1-afbf-5aff90067f3b" (UID: "eedd886d-5443-47e1-afbf-5aff90067f3b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482528 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482569 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482579 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482590 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482603 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88hg4\" (UniqueName: \"kubernetes.io/projected/eedd886d-5443-47e1-afbf-5aff90067f3b-kube-api-access-88hg4\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482614 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482623 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482634 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482643 4926 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482654 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482664 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.482675 4926 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eedd886d-5443-47e1-afbf-5aff90067f3b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.505540 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a46d07-aea8-4496-b088-1d30771449a5" path="/var/lib/kubelet/pods/d9a46d07-aea8-4496-b088-1d30771449a5/volumes" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.880003 4926 generic.go:334] "Generic (PLEG): container finished" podID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerID="9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98" exitCode=0 Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.880949 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.881038 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" event={"ID":"eedd886d-5443-47e1-afbf-5aff90067f3b","Type":"ContainerDied","Data":"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98"} Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.881062 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7rmsn" event={"ID":"eedd886d-5443-47e1-afbf-5aff90067f3b","Type":"ContainerDied","Data":"f375d41ba80f860b9235fe8ebb0d083845e27083ae0a2e004437a84b6b1f824e"} Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.881079 4926 scope.go:117] "RemoveContainer" containerID="9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.885664 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.896887 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.899689 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7rmsn"] Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.903745 4926 scope.go:117] "RemoveContainer" containerID="9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98" Mar 12 18:07:30 crc kubenswrapper[4926]: E0312 18:07:30.904511 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98\": container with ID starting with 9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98 not found: ID does not exist" containerID="9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98" Mar 12 18:07:30 crc kubenswrapper[4926]: I0312 18:07:30.904557 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98"} err="failed to get container status \"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98\": rpc error: code = NotFound desc = could not find container \"9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98\": container with ID starting with 9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98 not found: ID does not exist" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.479667 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.494870 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.494905 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.537010 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.541271 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.687238 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.687567 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.728971 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.881282 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.881352 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.932171 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.943207 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:07:31 crc kubenswrapper[4926]: I0312 18:07:31.944376 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:32 crc kubenswrapper[4926]: I0312 18:07:32.504489 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
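podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" path="/var/lib/kubelet/pods/eedd886d-5443-47e1-afbf-5aff90067f3b/volumes"

The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" lines above are benign: by the time the kubelet re-requests the container's status, CRI-O has already deleted it, so the CRI lookup returns gRPC NotFound. A sketch of the same lookup against CRI-O (the socket path and the cri-api v1 client are assumptions of this sketch):

```go
// A sketch: query the CRI runtime for a container that may already be gone.
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed default CRI-O socket on this node.
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	_, err = rt.ContainerStatus(ctx, &runtimeapi.ContainerStatusRequest{
		ContainerId: "9eb303537b742a873d0e271421bd5a8464795ffe3ddf357bdfd78ea7386ecc98",
	})
	if status.Code(err) == codes.NotFound {
		// The condition the kubelet hits above: deletion already complete.
		fmt.Println("container already gone; nothing left to delete")
	}
}
```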
podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" path="/var/lib/kubelet/pods/eedd886d-5443-47e1-afbf-5aff90067f3b/volumes" Mar 12 18:07:32 crc kubenswrapper[4926]: I0312 18:07:32.948086 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.555357 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.555428 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.619479 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.904941 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.904999 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.952772 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:07:33 crc kubenswrapper[4926]: I0312 18:07:33.965515 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.129269 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t4sbh"] Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.504729 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.504998 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.896920 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.897049 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.913194 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t4sbh" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="registry-server" containerID="cri-o://2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970" gracePeriod=2 Mar 12 18:07:34 crc kubenswrapper[4926]: I0312 18:07:34.978933 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.351227 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.457893 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities\") pod \"150781c8-5ae3-42a6-b351-2388dfe84167\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.457938 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnjm4\" (UniqueName: \"kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4\") pod \"150781c8-5ae3-42a6-b351-2388dfe84167\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.457963 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content\") pod \"150781c8-5ae3-42a6-b351-2388dfe84167\" (UID: \"150781c8-5ae3-42a6-b351-2388dfe84167\") " Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.459188 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities" (OuterVolumeSpecName: "utilities") pod "150781c8-5ae3-42a6-b351-2388dfe84167" (UID: "150781c8-5ae3-42a6-b351-2388dfe84167"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.465693 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.467787 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4" (OuterVolumeSpecName: "kube-api-access-tnjm4") pod "150781c8-5ae3-42a6-b351-2388dfe84167" (UID: "150781c8-5ae3-42a6-b351-2388dfe84167"). InnerVolumeSpecName "kube-api-access-tnjm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.512046 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "150781c8-5ae3-42a6-b351-2388dfe84167" (UID: "150781c8-5ae3-42a6-b351-2388dfe84167"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.550780 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zx9h7" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="registry-server" probeResult="failure" output=< Mar 12 18:07:35 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:07:35 crc kubenswrapper[4926]: > Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.567389 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnjm4\" (UniqueName: \"kubernetes.io/projected/150781c8-5ae3-42a6-b351-2388dfe84167-kube-api-access-tnjm4\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.567461 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/150781c8-5ae3-42a6-b351-2388dfe84167-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.920472 4926 generic.go:334] "Generic (PLEG): container finished" podID="150781c8-5ae3-42a6-b351-2388dfe84167" containerID="2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970" exitCode=0 Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.920721 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerDied","Data":"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970"} Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.920698 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t4sbh" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.921060 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t4sbh" event={"ID":"150781c8-5ae3-42a6-b351-2388dfe84167","Type":"ContainerDied","Data":"f0734678dc103280ac23f9e4a0964bfd2dcfda20bb5ae8365cb587d30b175b62"} Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.921077 4926 scope.go:117] "RemoveContainer" containerID="2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.926828 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"] Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.927072 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zpnq8" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="registry-server" containerID="cri-o://75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff" gracePeriod=2 Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.944586 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hmn6x" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="registry-server" probeResult="failure" output=< Mar 12 18:07:35 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:07:35 crc kubenswrapper[4926]: > Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.944747 4926 scope.go:117] "RemoveContainer" containerID="e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.955127 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-t4sbh"] Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.958148 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t4sbh"] Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.967087 4926 scope.go:117] "RemoveContainer" containerID="2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.991279 4926 scope.go:117] "RemoveContainer" containerID="2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970" Mar 12 18:07:35 crc kubenswrapper[4926]: E0312 18:07:35.992699 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970\": container with ID starting with 2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970 not found: ID does not exist" containerID="2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.992734 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970"} err="failed to get container status \"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970\": rpc error: code = NotFound desc = could not find container \"2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970\": container with ID starting with 2edc573e204b1cec4545170d5c4313584b0e49b6fa8db612c1be47281fb04970 not found: ID does not exist" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.992753 4926 scope.go:117] "RemoveContainer" containerID="e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a" Mar 12 18:07:35 crc kubenswrapper[4926]: E0312 18:07:35.993164 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a\": container with ID starting with e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a not found: ID does not exist" containerID="e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.993231 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a"} err="failed to get container status \"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a\": rpc error: code = NotFound desc = could not find container \"e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a\": container with ID starting with e271f20c168fa4ef1ea9fdd75f78dc064672cebc67bc3259aae0ce8d79cdc29a not found: ID does not exist" Mar 12 18:07:35 crc kubenswrapper[4926]: I0312 18:07:35.993291 4926 scope.go:117] "RemoveContainer" containerID="2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139" Mar 12 18:07:35 crc kubenswrapper[4926]: E0312 18:07:35.993636 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139\": container with ID starting with 2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139 not found: ID does not exist" containerID="2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139" Mar 12 18:07:35 crc 
kubenswrapper[4926]: I0312 18:07:35.993676 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139"} err="failed to get container status \"2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139\": rpc error: code = NotFound desc = could not find container \"2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139\": container with ID starting with 2262d4d70a418b07a3dcb18723e596af882371729d1b0168b7d72456d55f1139 not found: ID does not exist" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.389064 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.478287 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities\") pod \"743a7318-33d0-4a59-93bd-7c6899554e5e\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.478346 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsdsv\" (UniqueName: \"kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv\") pod \"743a7318-33d0-4a59-93bd-7c6899554e5e\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.478394 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content\") pod \"743a7318-33d0-4a59-93bd-7c6899554e5e\" (UID: \"743a7318-33d0-4a59-93bd-7c6899554e5e\") " Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.480645 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities" (OuterVolumeSpecName: "utilities") pod "743a7318-33d0-4a59-93bd-7c6899554e5e" (UID: "743a7318-33d0-4a59-93bd-7c6899554e5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.481964 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.484815 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv" (OuterVolumeSpecName: "kube-api-access-fsdsv") pod "743a7318-33d0-4a59-93bd-7c6899554e5e" (UID: "743a7318-33d0-4a59-93bd-7c6899554e5e"). InnerVolumeSpecName "kube-api-access-fsdsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.504149 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" path="/var/lib/kubelet/pods/150781c8-5ae3-42a6-b351-2388dfe84167/volumes" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.526043 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.534287 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "743a7318-33d0-4a59-93bd-7c6899554e5e" (UID: "743a7318-33d0-4a59-93bd-7c6899554e5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.583818 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743a7318-33d0-4a59-93bd-7c6899554e5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.583872 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsdsv\" (UniqueName: \"kubernetes.io/projected/743a7318-33d0-4a59-93bd-7c6899554e5e-kube-api-access-fsdsv\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935511 4926 generic.go:334] "Generic (PLEG): container finished" podID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerID="75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff" exitCode=0 Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935566 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpnq8" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935612 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerDied","Data":"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff"} Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935645 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpnq8" event={"ID":"743a7318-33d0-4a59-93bd-7c6899554e5e","Type":"ContainerDied","Data":"dc92306c9f92008ccb5e4b67f4e37a7020d96c316ccb9bc2ecf40d9ac23d9cf2"} Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935666 4926 scope.go:117] "RemoveContainer" containerID="75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.935966 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v5qw8" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="registry-server" containerID="cri-o://bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9" gracePeriod=2 Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.960668 4926 scope.go:117] "RemoveContainer" containerID="ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199" Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.974213 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"] Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.975394 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zpnq8"] Mar 12 18:07:36 crc kubenswrapper[4926]: I0312 18:07:36.983700 4926 scope.go:117] "RemoveContainer" containerID="80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.061768 4926 scope.go:117] "RemoveContainer" containerID="75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.062471 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff\": container with ID starting with 75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff not found: ID does not exist" containerID="75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.062502 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff"} err="failed to get container status \"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff\": rpc error: code = NotFound desc = could not find container \"75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff\": container with ID starting with 75d3832859836653ff1e10c1d5dba06996ff45d8969ab164fd1416b7403fd4ff not found: ID does not exist" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.062530 4926 scope.go:117] "RemoveContainer" containerID="ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.062938 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199\": container with ID starting with ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199 not found: ID does not exist" containerID="ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.063027 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199"} err="failed to get container status \"ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199\": rpc error: code = NotFound desc = could not find container \"ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199\": container with ID starting with ebf537f8f44f9ee169e4b4c6b4fc14ae60e34fc19d7bb85795f2dd414478a199 not found: ID does not exist" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.063082 4926 scope.go:117] "RemoveContainer" containerID="80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.063824 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9\": container with ID starting with 80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9 not found: ID does not exist" containerID="80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.063851 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9"} err="failed to get container status \"80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9\": rpc error: code = NotFound desc = could not find container \"80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9\": container with ID starting with 80367076dd605a92b7bdc13d1f48d31cd8288852dc66b581eb496ee2193179d9 not found: ID does not exist" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.427106 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.492906 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content\") pod \"8630848f-c268-4f4a-9fd0-8f33765c20b4\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.492977 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities\") pod \"8630848f-c268-4f4a-9fd0-8f33765c20b4\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.493012 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkgrh\" (UniqueName: \"kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh\") pod \"8630848f-c268-4f4a-9fd0-8f33765c20b4\" (UID: \"8630848f-c268-4f4a-9fd0-8f33765c20b4\") " Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.494421 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities" (OuterVolumeSpecName: "utilities") pod "8630848f-c268-4f4a-9fd0-8f33765c20b4" (UID: "8630848f-c268-4f4a-9fd0-8f33765c20b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.497740 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh" (OuterVolumeSpecName: "kube-api-access-pkgrh") pod "8630848f-c268-4f4a-9fd0-8f33765c20b4" (UID: "8630848f-c268-4f4a-9fd0-8f33765c20b4"). InnerVolumeSpecName "kube-api-access-pkgrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.521349 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8630848f-c268-4f4a-9fd0-8f33765c20b4" (UID: "8630848f-c268-4f4a-9fd0-8f33765c20b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.594537 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.594571 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8630848f-c268-4f4a-9fd0-8f33765c20b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.594582 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkgrh\" (UniqueName: \"kubernetes.io/projected/8630848f-c268-4f4a-9fd0-8f33765c20b4-kube-api-access-pkgrh\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.800679 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-zw9n6"] Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.801802 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.801851 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.801900 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.801919 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.801996 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802015 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802039 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerName="oauth-openshift" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802057 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerName="oauth-openshift" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802113 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802131 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802157 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802174 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="extract-content" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802212 4926 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802231 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802270 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802288 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802353 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802373 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: E0312 18:07:37.802406 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.802424 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="extract-utilities" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.803184 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.803235 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="150781c8-5ae3-42a6-b351-2388dfe84167" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.803278 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerName="registry-server" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.803302 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedd886d-5443-47e1-afbf-5aff90067f3b" containerName="oauth-openshift" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.804593 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863085 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863129 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863181 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863219 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863314 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.863867 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.865047 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.865513 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.869424 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.870931 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.872166 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.872202 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.879539 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.880639 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.889660 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-zw9n6"] Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.896951 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.897933 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " 
pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898004 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp2j4\" (UniqueName: \"kubernetes.io/projected/6504d75a-184b-4809-bdaf-13993983da43-kube-api-access-sp2j4\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898054 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898077 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898117 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898141 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898171 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6504d75a-184b-4809-bdaf-13993983da43-audit-dir\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898221 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898254 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898319 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898337 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898540 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898575 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.898613 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-audit-policies\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.948212 4926 generic.go:334] "Generic (PLEG): container finished" podID="8630848f-c268-4f4a-9fd0-8f33765c20b4" containerID="bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9" exitCode=0 Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.948261 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerDied","Data":"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9"} Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.948289 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v5qw8" event={"ID":"8630848f-c268-4f4a-9fd0-8f33765c20b4","Type":"ContainerDied","Data":"e7bb37ba860cfb3ecfc47e3421fdc163d8b676f30a5d56c59a6b61ad57e2cd0c"} Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.948313 4926 scope.go:117] "RemoveContainer" 
containerID="bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.948506 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v5qw8" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.974910 4926 scope.go:117] "RemoveContainer" containerID="23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999265 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999313 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999341 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-audit-policies\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999349 4926 scope.go:117] "RemoveContainer" containerID="ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999365 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999386 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp2j4\" (UniqueName: \"kubernetes.io/projected/6504d75a-184b-4809-bdaf-13993983da43-kube-api-access-sp2j4\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999411 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999452 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-provider-selection\") 
pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999476 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999469 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999501 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999529 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6504d75a-184b-4809-bdaf-13993983da43-audit-dir\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999556 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999585 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999617 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:37 crc kubenswrapper[4926]: I0312 18:07:37.999637 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.000717 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-service-ca\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.000780 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-audit-policies\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.001001 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.001361 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6504d75a-184b-4809-bdaf-13993983da43-audit-dir\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.002716 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.006787 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-error\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.006805 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.008989 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-router-certs\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.009023 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-template-login\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.010160 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v5qw8"] Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.011052 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.012156 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-session\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.013256 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.015047 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6504d75a-184b-4809-bdaf-13993983da43-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.037089 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp2j4\" (UniqueName: \"kubernetes.io/projected/6504d75a-184b-4809-bdaf-13993983da43-kube-api-access-sp2j4\") pod \"oauth-openshift-68755f559b-zw9n6\" (UID: \"6504d75a-184b-4809-bdaf-13993983da43\") " pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.038978 4926 scope.go:117] "RemoveContainer" containerID="bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9" Mar 12 18:07:38 crc kubenswrapper[4926]: E0312 18:07:38.039323 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9\": container with ID starting with bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9 not found: ID does not exist" containerID="bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.039371 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9"} err="failed to get container status 
\"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9\": rpc error: code = NotFound desc = could not find container \"bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9\": container with ID starting with bab8d0e3055afeafd4a0d6f6f1f368c0c90798b28e54a364754943c7018fc4a9 not found: ID does not exist" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.039402 4926 scope.go:117] "RemoveContainer" containerID="23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0" Mar 12 18:07:38 crc kubenswrapper[4926]: E0312 18:07:38.039936 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0\": container with ID starting with 23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0 not found: ID does not exist" containerID="23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.039969 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0"} err="failed to get container status \"23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0\": rpc error: code = NotFound desc = could not find container \"23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0\": container with ID starting with 23e4b185b3be270175429e9780ce6edd9c856427857e8483640952f1aea32ee0 not found: ID does not exist" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.039990 4926 scope.go:117] "RemoveContainer" containerID="ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3" Mar 12 18:07:38 crc kubenswrapper[4926]: E0312 18:07:38.040174 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3\": container with ID starting with ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3 not found: ID does not exist" containerID="ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.040204 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3"} err="failed to get container status \"ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3\": rpc error: code = NotFound desc = could not find container \"ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3\": container with ID starting with ed7f7979c7884d3e906be4f3ddd912adf35a0794bb60ca54cf1534b508fe81a3 not found: ID does not exist" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.186818 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.498381 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743a7318-33d0-4a59-93bd-7c6899554e5e" path="/var/lib/kubelet/pods/743a7318-33d0-4a59-93bd-7c6899554e5e/volumes" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.499070 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8630848f-c268-4f4a-9fd0-8f33765c20b4" path="/var/lib/kubelet/pods/8630848f-c268-4f4a-9fd0-8f33765c20b4/volumes" Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.632355 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68755f559b-zw9n6"] Mar 12 18:07:38 crc kubenswrapper[4926]: W0312 18:07:38.637485 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6504d75a_184b_4809_bdaf_13993983da43.slice/crio-5a3d486791cdcdc8b87a6bcc221dcab256a616945fc43dc009396dd8982ea0cd WatchSource:0}: Error finding container 5a3d486791cdcdc8b87a6bcc221dcab256a616945fc43dc009396dd8982ea0cd: Status 404 returned error can't find the container with id 5a3d486791cdcdc8b87a6bcc221dcab256a616945fc43dc009396dd8982ea0cd Mar 12 18:07:38 crc kubenswrapper[4926]: I0312 18:07:38.956272 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" event={"ID":"6504d75a-184b-4809-bdaf-13993983da43","Type":"ContainerStarted","Data":"5a3d486791cdcdc8b87a6bcc221dcab256a616945fc43dc009396dd8982ea0cd"} Mar 12 18:07:39 crc kubenswrapper[4926]: I0312 18:07:39.966289 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" event={"ID":"6504d75a-184b-4809-bdaf-13993983da43","Type":"ContainerStarted","Data":"87287b9d71746c3893271027c2a3f96b1867e2e5354267f2c6fb5b64c9c65375"} Mar 12 18:07:39 crc kubenswrapper[4926]: I0312 18:07:39.966680 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:39 crc kubenswrapper[4926]: I0312 18:07:39.975901 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" Mar 12 18:07:39 crc kubenswrapper[4926]: I0312 18:07:39.995959 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68755f559b-zw9n6" podStartSLOduration=35.995930905 podStartE2EDuration="35.995930905s" podCreationTimestamp="2026-03-12 18:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:39.992329082 +0000 UTC m=+300.360955445" watchObservedRunningTime="2026-03-12 18:07:39.995930905 +0000 UTC m=+300.364557268" Mar 12 18:07:44 crc kubenswrapper[4926]: I0312 18:07:44.554335 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:07:44 crc kubenswrapper[4926]: I0312 18:07:44.600659 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:07:44 crc kubenswrapper[4926]: I0312 18:07:44.940491 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:07:44 crc 
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.163563 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"]
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.164084 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" podUID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" containerName="controller-manager" containerID="cri-o://4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2" gracePeriod=30
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.248547 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"]
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.248774 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerName="route-controller-manager" containerID="cri-o://19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128" gracePeriod=30
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.663622 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch"
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.670938 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725427 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config\") pod \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") "
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725548 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert\") pod \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") "
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725578 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgt8s\" (UniqueName: \"kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s\") pod \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") "
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725603 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles\") pod \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") "
Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725629 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8k8\" (UniqueName: \"kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8\") pod \"be3d67c9-0e01-48ab-8091-2e0bf103655d\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") "
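
The "Killing container with a grace period" entries above (gracePeriod=30 here, and gracePeriod=2 for the registry server earlier) map onto a CRI StopContainer call whose timeout is the grace period; when it expires, the runtime force-kills the container. A sketch, assuming an already-wired CRI client:

```go
// Sketch: graceful stop via CRI. The runtime delivers the stop signal,
// waits up to Timeout seconds, then force-kills the container.
package sketch

import (
	"context"

	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func stopWithGrace(ctx context.Context, rt runtimeapi.RuntimeServiceClient, id string, graceSeconds int64) error {
	_, err := rt.StopContainer(ctx, &runtimeapi.StopContainerRequest{
		ContainerId: id,           // e.g. the cri-o ID from the log, scheme stripped
		Timeout:     graceSeconds, // 30 for controller-manager, 2 for registry-server
	})
	return err
}
```
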
\"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725650 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config\") pod \"be3d67c9-0e01-48ab-8091-2e0bf103655d\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725682 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca\") pod \"be3d67c9-0e01-48ab-8091-2e0bf103655d\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725697 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca\") pod \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\" (UID: \"a5f25640-5527-4caa-a290-cbb3bbfc5e0b\") " Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.725713 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert\") pod \"be3d67c9-0e01-48ab-8091-2e0bf103655d\" (UID: \"be3d67c9-0e01-48ab-8091-2e0bf103655d\") " Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.726353 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a5f25640-5527-4caa-a290-cbb3bbfc5e0b" (UID: "a5f25640-5527-4caa-a290-cbb3bbfc5e0b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.726876 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a5f25640-5527-4caa-a290-cbb3bbfc5e0b" (UID: "a5f25640-5527-4caa-a290-cbb3bbfc5e0b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.727182 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca" (OuterVolumeSpecName: "client-ca") pod "be3d67c9-0e01-48ab-8091-2e0bf103655d" (UID: "be3d67c9-0e01-48ab-8091-2e0bf103655d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.727294 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config" (OuterVolumeSpecName: "config") pod "be3d67c9-0e01-48ab-8091-2e0bf103655d" (UID: "be3d67c9-0e01-48ab-8091-2e0bf103655d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.727328 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config" (OuterVolumeSpecName: "config") pod "a5f25640-5527-4caa-a290-cbb3bbfc5e0b" (UID: "a5f25640-5527-4caa-a290-cbb3bbfc5e0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.731133 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8" (OuterVolumeSpecName: "kube-api-access-mw8k8") pod "be3d67c9-0e01-48ab-8091-2e0bf103655d" (UID: "be3d67c9-0e01-48ab-8091-2e0bf103655d"). InnerVolumeSpecName "kube-api-access-mw8k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.731202 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be3d67c9-0e01-48ab-8091-2e0bf103655d" (UID: "be3d67c9-0e01-48ab-8091-2e0bf103655d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.731284 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s" (OuterVolumeSpecName: "kube-api-access-zgt8s") pod "a5f25640-5527-4caa-a290-cbb3bbfc5e0b" (UID: "a5f25640-5527-4caa-a290-cbb3bbfc5e0b"). InnerVolumeSpecName "kube-api-access-zgt8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.732143 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a5f25640-5527-4caa-a290-cbb3bbfc5e0b" (UID: "a5f25640-5527-4caa-a290-cbb3bbfc5e0b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827544 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827590 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d67c9-0e01-48ab-8091-2e0bf103655d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827603 4926 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827614 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d67c9-0e01-48ab-8091-2e0bf103655d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827628 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827637 4926 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827649 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgt8s\" (UniqueName: \"kubernetes.io/projected/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-kube-api-access-zgt8s\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827665 4926 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5f25640-5527-4caa-a290-cbb3bbfc5e0b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:47 crc kubenswrapper[4926]: I0312 18:07:47.827675 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8k8\" (UniqueName: \"kubernetes.io/projected/be3d67c9-0e01-48ab-8091-2e0bf103655d-kube-api-access-mw8k8\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.016159 4926 generic.go:334] "Generic (PLEG): container finished" podID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" containerID="4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2" exitCode=0 Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.016269 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.016266 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" event={"ID":"a5f25640-5527-4caa-a290-cbb3bbfc5e0b","Type":"ContainerDied","Data":"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2"} Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.016969 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74548584bf-x85ch" event={"ID":"a5f25640-5527-4caa-a290-cbb3bbfc5e0b","Type":"ContainerDied","Data":"9b5a165d0fdbcf22977eee75175ce438f8c21d2c260b57e0ebed1521f9e89fb5"} Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.017006 4926 scope.go:117] "RemoveContainer" containerID="4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.018921 4926 generic.go:334] "Generic (PLEG): container finished" podID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerID="19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128" exitCode=0 Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.018976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" event={"ID":"be3d67c9-0e01-48ab-8091-2e0bf103655d","Type":"ContainerDied","Data":"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128"} Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.019041 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" event={"ID":"be3d67c9-0e01-48ab-8091-2e0bf103655d","Type":"ContainerDied","Data":"581ff73fc13e04cb81a4a9c9a97934c14adf3bf26fd03837864c6f60ecb97184"} Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.019242 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.049462 4926 scope.go:117] "RemoveContainer" containerID="4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2" Mar 12 18:07:48 crc kubenswrapper[4926]: E0312 18:07:48.050127 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2\": container with ID starting with 4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2 not found: ID does not exist" containerID="4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.050166 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2"} err="failed to get container status \"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2\": rpc error: code = NotFound desc = could not find container \"4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2\": container with ID starting with 4549267ab7af25c6d9ee4556186f3dd19ae56cb72fc8f13eb3af40054eada3f2 not found: ID does not exist" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.050190 4926 scope.go:117] "RemoveContainer" containerID="19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.062321 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.065419 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74548584bf-x85ch"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.073949 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.077591 4926 scope.go:117] "RemoveContainer" containerID="19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128" Mar 12 18:07:48 crc kubenswrapper[4926]: E0312 18:07:48.078276 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128\": container with ID starting with 19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128 not found: ID does not exist" containerID="19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.078334 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128"} err="failed to get container status \"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128\": rpc error: code = NotFound desc = could not find container \"19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128\": container with ID starting with 19399e69849a8c08401404aad3ecf5af11661e7750f96f5570cd0700cde99128 not found: ID does not exist" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.079487 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b5767577f-5znp2"] Mar 12 
18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.500239 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" path="/var/lib/kubelet/pods/a5f25640-5527-4caa-a290-cbb3bbfc5e0b/volumes" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.501090 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" path="/var/lib/kubelet/pods/be3d67c9-0e01-48ab-8091-2e0bf103655d/volumes" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.821599 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cfc589789-g48s2"] Mar 12 18:07:48 crc kubenswrapper[4926]: E0312 18:07:48.821885 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" containerName="controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.821899 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" containerName="controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: E0312 18:07:48.821941 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerName="route-controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.821950 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerName="route-controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.822054 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f25640-5527-4caa-a290-cbb3bbfc5e0b" containerName="controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.822066 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3d67c9-0e01-48ab-8091-2e0bf103655d" containerName="route-controller-manager" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.822534 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.827188 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.827959 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.828618 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.828688 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.828855 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.829997 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.832881 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.834608 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.847200 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.847595 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cfc589789-g48s2"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.848214 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.848492 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9hgc\" (UniqueName: \"kubernetes.io/projected/439f4a23-9447-4999-b30e-7aa07f2b1553-kube-api-access-w9hgc\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.848754 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f4a23-9447-4999-b30e-7aa07f2b1553-serving-cert\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.849124 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx6w\" (UniqueName: \"kubernetes.io/projected/0bcd8e23-295b-4e93-a842-3e25d10c10ef-kube-api-access-dbx6w\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.849368 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-client-ca\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.849612 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-config\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.849884 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd8e23-295b-4e93-a842-3e25d10c10ef-serving-cert\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: 
I0312 18:07:48.850067 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-client-ca\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.850414 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-config\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.849701 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.850606 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.850661 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-proxy-ca-bundles\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.851597 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.853619 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.862374 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.862560 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.927072 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.927402 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmn6x" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="registry-server" containerID="cri-o://9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167" gracePeriod=2 Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.955674 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx6w\" (UniqueName: \"kubernetes.io/projected/0bcd8e23-295b-4e93-a842-3e25d10c10ef-kube-api-access-dbx6w\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.956046 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-client-ca\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.956182 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-config\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.957032 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-client-ca\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.957603 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd8e23-295b-4e93-a842-3e25d10c10ef-serving-cert\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.957655 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-client-ca\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.957679 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-config\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.958061 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-proxy-ca-bundles\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.958458 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-config\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.959745 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9hgc\" (UniqueName: \"kubernetes.io/projected/439f4a23-9447-4999-b30e-7aa07f2b1553-kube-api-access-w9hgc\") pod 
\"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.959802 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f4a23-9447-4999-b30e-7aa07f2b1553-serving-cert\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.960247 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd8e23-295b-4e93-a842-3e25d10c10ef-config\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.960262 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-client-ca\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.960286 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/439f4a23-9447-4999-b30e-7aa07f2b1553-proxy-ca-bundles\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.965216 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd8e23-295b-4e93-a842-3e25d10c10ef-serving-cert\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.965949 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/439f4a23-9447-4999-b30e-7aa07f2b1553-serving-cert\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.979591 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx6w\" (UniqueName: \"kubernetes.io/projected/0bcd8e23-295b-4e93-a842-3e25d10c10ef-kube-api-access-dbx6w\") pod \"route-controller-manager-5499f549c8-qpmc8\" (UID: \"0bcd8e23-295b-4e93-a842-3e25d10c10ef\") " pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:48 crc kubenswrapper[4926]: I0312 18:07:48.988752 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9hgc\" (UniqueName: \"kubernetes.io/projected/439f4a23-9447-4999-b30e-7aa07f2b1553-kube-api-access-w9hgc\") pod \"controller-manager-cfc589789-g48s2\" (UID: \"439f4a23-9447-4999-b30e-7aa07f2b1553\") " pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 
18:07:49.157115 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.166633 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.347311 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.368371 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content\") pod \"1f425571-9ce5-4fdc-9631-7683efa292aa\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.368484 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities\") pod \"1f425571-9ce5-4fdc-9631-7683efa292aa\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.368550 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4msz\" (UniqueName: \"kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz\") pod \"1f425571-9ce5-4fdc-9631-7683efa292aa\" (UID: \"1f425571-9ce5-4fdc-9631-7683efa292aa\") " Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.369604 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities" (OuterVolumeSpecName: "utilities") pod "1f425571-9ce5-4fdc-9631-7683efa292aa" (UID: "1f425571-9ce5-4fdc-9631-7683efa292aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.375658 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz" (OuterVolumeSpecName: "kube-api-access-v4msz") pod "1f425571-9ce5-4fdc-9631-7683efa292aa" (UID: "1f425571-9ce5-4fdc-9631-7683efa292aa"). InnerVolumeSpecName "kube-api-access-v4msz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.406661 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8"] Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.470315 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.470345 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4msz\" (UniqueName: \"kubernetes.io/projected/1f425571-9ce5-4fdc-9631-7683efa292aa-kube-api-access-v4msz\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.503676 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f425571-9ce5-4fdc-9631-7683efa292aa" (UID: "1f425571-9ce5-4fdc-9631-7683efa292aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.571818 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f425571-9ce5-4fdc-9631-7683efa292aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:49 crc kubenswrapper[4926]: I0312 18:07:49.660874 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cfc589789-g48s2"] Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.034859 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" event={"ID":"439f4a23-9447-4999-b30e-7aa07f2b1553","Type":"ContainerStarted","Data":"c6cbe8ef6255982af6256393e3246146b85748ebe1583b5f0e9e9a8d5abccf68"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.034926 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" event={"ID":"439f4a23-9447-4999-b30e-7aa07f2b1553","Type":"ContainerStarted","Data":"371829af9ec1fd277ceecbaffa5e067453dc8db27bb4cad207e9ce29d2560cda"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.035254 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.037149 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" event={"ID":"0bcd8e23-295b-4e93-a842-3e25d10c10ef","Type":"ContainerStarted","Data":"db0fe8825514367fe787d924830623f1474b74857c21979b15bf4cb6d0ccb0ed"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.037186 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" event={"ID":"0bcd8e23-295b-4e93-a842-3e25d10c10ef","Type":"ContainerStarted","Data":"46ea61a8c49a4b2f157f901e95f006ea70fa8ba384f61f1b6661bfc3e219c26c"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.037371 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 
18:07:50.054689 4926 generic.go:334] "Generic (PLEG): container finished" podID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerID="9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167" exitCode=0 Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.054752 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerDied","Data":"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.054787 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmn6x" event={"ID":"1f425571-9ce5-4fdc-9631-7683efa292aa","Type":"ContainerDied","Data":"d295389691829deab9da8e29d0c26dfd809e13bacff8864270d0d1a8c5b14dcd"} Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.054813 4926 scope.go:117] "RemoveContainer" containerID="9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.054815 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmn6x" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.068821 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.091674 4926 scope.go:117] "RemoveContainer" containerID="9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.115633 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cfc589789-g48s2" podStartSLOduration=3.115610955 podStartE2EDuration="3.115610955s" podCreationTimestamp="2026-03-12 18:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:50.081279221 +0000 UTC m=+310.449905554" watchObservedRunningTime="2026-03-12 18:07:50.115610955 +0000 UTC m=+310.484237288" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.129023 4926 scope.go:117] "RemoveContainer" containerID="3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.153477 4926 scope.go:117] "RemoveContainer" containerID="9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167" Mar 12 18:07:50 crc kubenswrapper[4926]: E0312 18:07:50.155842 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167\": container with ID starting with 9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167 not found: ID does not exist" containerID="9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.155890 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167"} err="failed to get container status \"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167\": rpc error: code = NotFound desc = could not find container \"9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167\": container with ID starting with 
9953fa9c8ae4779c66f50a9da2e5055796da72263bfafc670cc521d8eb8da167 not found: ID does not exist" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.155915 4926 scope.go:117] "RemoveContainer" containerID="9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429" Mar 12 18:07:50 crc kubenswrapper[4926]: E0312 18:07:50.156226 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429\": container with ID starting with 9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429 not found: ID does not exist" containerID="9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.156313 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429"} err="failed to get container status \"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429\": rpc error: code = NotFound desc = could not find container \"9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429\": container with ID starting with 9a932193e2b455fb9379d7e34c8f9171536a3e43595b8c2b4822103bfb25a429 not found: ID does not exist" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.156380 4926 scope.go:117] "RemoveContainer" containerID="3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a" Mar 12 18:07:50 crc kubenswrapper[4926]: E0312 18:07:50.159131 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a\": container with ID starting with 3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a not found: ID does not exist" containerID="3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.159306 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a"} err="failed to get container status \"3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a\": rpc error: code = NotFound desc = could not find container \"3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a\": container with ID starting with 3e95babad73f5786563a2d69b70a26fb937a9b6ccb630b521fa9357c578cba1a not found: ID does not exist" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.162760 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" podStartSLOduration=3.162747862 podStartE2EDuration="3.162747862s" podCreationTimestamp="2026-03-12 18:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:07:50.160668946 +0000 UTC m=+310.529295289" watchObservedRunningTime="2026-03-12 18:07:50.162747862 +0000 UTC m=+310.531374195" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.186934 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.190660 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmn6x"] Mar 12 18:07:50 crc 
kubenswrapper[4926]: I0312 18:07:50.198199 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5499f549c8-qpmc8" Mar 12 18:07:50 crc kubenswrapper[4926]: I0312 18:07:50.495325 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" path="/var/lib/kubelet/pods/1f425571-9ce5-4fdc-9631-7683efa292aa/volumes" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.504717 4926 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.505203 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="extract-content" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.505218 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="extract-content" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.505231 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="registry-server" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.505239 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="registry-server" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.505257 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="extract-utilities" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.505266 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="extract-utilities" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.505404 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f425571-9ce5-4fdc-9631-7683efa292aa" containerName="registry-server" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.505827 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506481 4926 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506808 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b" gracePeriod=15 Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506876 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808" gracePeriod=15 Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506895 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319" gracePeriod=15 Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506945 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4" gracePeriod=15 Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.506991 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882" gracePeriod=15 Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508643 4926 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508783 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508793 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508805 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508815 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508825 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508833 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 
18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508844 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508852 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508861 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508868 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508879 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508888 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508898 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508906 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508917 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508925 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.508936 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.508945 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509047 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509058 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509068 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509079 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509090 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 18:07:53 crc 
kubenswrapper[4926]: I0312 18:07:53.509099 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509109 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509119 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.509222 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509231 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.509369 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.523563 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.523682 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.523727 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.523968 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.524125 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.524222 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.524292 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.524369 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.559219 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.626944 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627514 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627589 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627618 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627638 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627664 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627681 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627707 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627782 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627827 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627851 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627873 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627905 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627927 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.627953 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.628022 4926 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: I0312 18:07:53.855715 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:07:53 crc kubenswrapper[4926]: E0312 18:07:53.875952 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c2a4f6bb58173 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:07:53.874973043 +0000 UTC m=+314.243599386,LastTimestamp:2026-03-12 18:07:53.874973043 +0000 UTC m=+314.243599386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.077624 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"66936a30ed9b3cad6731e6f16633aaba714831297c3f40433f5955a13bdac350"} Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.080215 4926 generic.go:334] "Generic (PLEG): container finished" podID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" containerID="988893c573b728d62d468c803dd6aa2333147e0f68cf77f807942af0ae0d7912" exitCode=0 Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.080267 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f","Type":"ContainerDied","Data":"988893c573b728d62d468c803dd6aa2333147e0f68cf77f807942af0ae0d7912"} Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.081254 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.081726 4926 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.081875 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.082105 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.083373 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.084020 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319" exitCode=0 Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.084043 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882" exitCode=0 Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.084052 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808" exitCode=0 Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.084063 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4" exitCode=2 Mar 12 18:07:54 crc kubenswrapper[4926]: I0312 18:07:54.084104 4926 scope.go:117] "RemoveContainer" containerID="21b030cc5a2a69caa8b11ed4cd0b4872399a6d989caa30ccaae9b5dcf68e5eab" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.094109 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.097922 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d"} Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.098630 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.100144 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.448862 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.449790 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.450303 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.552956 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir\") pod \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.553055 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" (UID: "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.553068 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access\") pod \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.553148 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock\") pod \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\" (UID: \"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f\") " Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.553387 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock" (OuterVolumeSpecName: "var-lock") pod "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" (UID: "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.553522 4926 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.561838 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" (UID: "63c14bbd-6ba6-42c2-9e94-bf3a6f68500f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.655599 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:55 crc kubenswrapper[4926]: I0312 18:07:55.656105 4926 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/63c14bbd-6ba6-42c2-9e94-bf3a6f68500f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.110853 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.114187 4926 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b" exitCode=0 Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.117775 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.117849 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"63c14bbd-6ba6-42c2-9e94-bf3a6f68500f","Type":"ContainerDied","Data":"491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544"} Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.117922 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491a64d2c203ce09a50339566c321ae1dad16f74cdb93b33b26da5d1f82b4544" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.147477 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.148406 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.424112 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.425211 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.425818 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.427374 4926 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.427912 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468034 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468099 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468131 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468194 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468224 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468345 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468777 4926 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468821 4926 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.468842 4926 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:07:56 crc kubenswrapper[4926]: I0312 18:07:56.497762 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 12 18:07:56 crc kubenswrapper[4926]: E0312 18:07:56.541603 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c2a4f6bb58173 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:07:53.874973043 +0000 UTC m=+314.243599386,LastTimestamp:2026-03-12 18:07:53.874973043 +0000 UTC m=+314.243599386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.126461 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.127370 4926 scope.go:117] "RemoveContainer" containerID="3d571ee532648a1c75519f9efc8effdd164f98979e9ab9d53610c940b0200319" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.127548 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.128142 4926 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.128890 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.129193 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.131111 4926 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.131418 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.131805 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.141974 4926 scope.go:117] "RemoveContainer" containerID="0905b9bfa33c6d9a362e6ef466636752b4bfa285f62a2376b909d31a96731882" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.158349 4926 scope.go:117] "RemoveContainer" containerID="077afd62dc8a90f869f162035ff5e84edf160a3105daf1a871634d10d13f9808" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.173281 4926 scope.go:117] "RemoveContainer" containerID="30d6c9fcb202b72e8db949c2ad4c55ba835f1e2b6f10db291360256e47884bf4" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.189337 4926 scope.go:117] "RemoveContainer" containerID="9c94763fccdf84de07158693ecdaf7781b6d2e8deae62cb9bd0d2bc8ffbddd7b" Mar 12 18:07:57 crc kubenswrapper[4926]: I0312 18:07:57.207414 4926 scope.go:117] "RemoveContainer" containerID="274f012f33be81fa551881a3695deba9e4195c945b1af81f04f11d56d1a3ea17" Mar 12 18:08:00 crc kubenswrapper[4926]: I0312 18:08:00.492585 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:00 crc kubenswrapper[4926]: I0312 18:08:00.494224 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.186856 4926 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.187549 4926 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.187875 4926 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.188193 4926 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.188668 4926 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:02 crc kubenswrapper[4926]: I0312 18:08:02.188751 4926 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.189135 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="200ms" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.390193 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="400ms" Mar 12 18:08:02 crc kubenswrapper[4926]: E0312 18:08:02.791633 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="800ms" Mar 12 18:08:03 crc kubenswrapper[4926]: E0312 18:08:03.593565 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.238:6443: connect: connection refused" interval="1.6s" Mar 12 18:08:05 crc kubenswrapper[4926]: E0312 18:08:05.194508 4926 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="3.2s" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.489059 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.490130 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.490595 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.507163 4926 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.507678 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:05 crc kubenswrapper[4926]: E0312 18:08:05.508346 4926 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:05 crc kubenswrapper[4926]: I0312 18:08:05.508977 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.193067 4926 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="94e0bc894b14dc9ff48b9251eaa507b5e6745e7b7b2af0a3fca692cb764bfb7d" exitCode=0 Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.193154 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"94e0bc894b14dc9ff48b9251eaa507b5e6745e7b7b2af0a3fca692cb764bfb7d"} Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.193195 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2a62f4e45010887aed487bd146dda0ec82a48298f0bb54b58f1a02b9e6c8235b"} Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.193631 4926 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.193647 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.194433 4926 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:06 crc kubenswrapper[4926]: E0312 18:08:06.194510 4926 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:06 crc kubenswrapper[4926]: I0312 18:08:06.194864 4926 status_manager.go:851] "Failed to get status for pod" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Mar 12 18:08:06 crc kubenswrapper[4926]: E0312 18:08:06.543788 4926 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c2a4f6bb58173 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 18:07:53.874973043 +0000 UTC m=+314.243599386,LastTimestamp:2026-03-12 
18:07:53.874973043 +0000 UTC m=+314.243599386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.218108 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.220676 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.220728 4926 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc" exitCode=1 Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.220807 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc"} Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.221202 4926 scope.go:117] "RemoveContainer" containerID="0476f8d782654dd0fc67232862117105145e03f9a9495fb7b7015c28dac5c4dc" Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.224561 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1075a64629b41faf6d84d7d274a4f2a3a4d4ea1b331e23003a01b3929fe7f326"} Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.224610 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"42d1a9d7ce46fada349e899645be2abf96f39c48643131be836bb1e6ba317eb8"} Mar 12 18:08:07 crc kubenswrapper[4926]: I0312 18:08:07.224625 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c9f907cab99bf962950a3b0868cccac7d05aa470a5a7ee87e0adc217fbed488"} Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.233098 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.234861 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.234937 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5f31c459da9f76e8cc69a726d7ff85128963c66fb1d5ef95af8a6621a8c3286"} Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.238087 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ceee911b542f90d3c7df26ffa0b59ee92dfa501fe1fc246b3066939c805fecfd"} Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.238419 4926 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.238490 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.238784 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d3d56189bb24e1bd56ef327ad4e2750cc591d838202e17d442d7bf7872fb8746"} Mar 12 18:08:08 crc kubenswrapper[4926]: I0312 18:08:08.238822 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:09 crc kubenswrapper[4926]: I0312 18:08:09.324183 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:08:10 crc kubenswrapper[4926]: I0312 18:08:10.509330 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:10 crc kubenswrapper[4926]: I0312 18:08:10.509361 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:10 crc kubenswrapper[4926]: I0312 18:08:10.516764 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:12 crc kubenswrapper[4926]: I0312 18:08:12.438978 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:08:12 crc kubenswrapper[4926]: I0312 18:08:12.439512 4926 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 12 18:08:12 crc kubenswrapper[4926]: I0312 18:08:12.439608 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 12 18:08:13 crc kubenswrapper[4926]: I0312 18:08:13.270523 4926 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:14 crc kubenswrapper[4926]: I0312 18:08:14.276401 4926 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:14 crc kubenswrapper[4926]: I0312 18:08:14.276717 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:14 crc kubenswrapper[4926]: I0312 18:08:14.280407 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:14 crc kubenswrapper[4926]: I0312 18:08:14.283190 4926 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="34d15346-5887-460e-8af4-28f9329010dd" Mar 12 18:08:15 crc kubenswrapper[4926]: I0312 18:08:15.282095 4926 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:15 crc kubenswrapper[4926]: I0312 18:08:15.282158 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="aeb621bb-05ee-456b-b869-1cdd14184ad1" Mar 12 18:08:20 crc kubenswrapper[4926]: I0312 18:08:20.520158 4926 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="34d15346-5887-460e-8af4-28f9329010dd" Mar 12 18:08:22 crc kubenswrapper[4926]: I0312 18:08:22.135629 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 18:08:22 crc kubenswrapper[4926]: I0312 18:08:22.445488 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:08:22 crc kubenswrapper[4926]: I0312 18:08:22.453311 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 18:08:22 crc kubenswrapper[4926]: I0312 18:08:22.960384 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 18:08:23 crc kubenswrapper[4926]: I0312 18:08:23.710113 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 18:08:23 crc kubenswrapper[4926]: I0312 18:08:23.753630 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.071087 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.349672 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.456714 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.638165 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.673287 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.718933 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.777401 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 18:08:24 crc kubenswrapper[4926]: 
Mar 12 18:08:24 crc kubenswrapper[4926]: I0312 18:08:24.937374 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.013003 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.209140 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.596951 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.862591 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.888256 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 18:08:25 crc kubenswrapper[4926]: I0312 18:08:25.975221 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.060854 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.179525 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.195996 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.212956 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.389690 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.401120 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.528720 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.761796 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 18:08:26 crc kubenswrapper[4926]: I0312 18:08:26.990941 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.001897 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.043847 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.060242 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.378493
4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.497983 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.664406 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.709465 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.744197 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.782347 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.803861 4926 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.805068 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 18:08:27 crc kubenswrapper[4926]: I0312 18:08:27.843885 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.080787 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.146298 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.192257 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.310551 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.403796 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.435585 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.468757 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.485703 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.509427 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.642791 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.760230 4926 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.761530 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.773770 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.776681 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.783352 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.802091 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.822343 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.854694 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.902565 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 18:08:28 crc kubenswrapper[4926]: I0312 18:08:28.935750 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.049220 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.067136 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.111327 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.122721 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.283290 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.283939 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.318817 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.335282 4926 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.341804 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 
18:08:29.354677 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.475862 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.537687 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.593882 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.665265 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.674777 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.861869 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 18:08:29 crc kubenswrapper[4926]: I0312 18:08:29.942571 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.019410 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.029019 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.052454 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.059891 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.172241 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.184224 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.203389 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.252472 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.264205 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.347300 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.364580 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
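Each "Caches populated for *v1.ConfigMap/..." line is a client-go reflector (reflector.go:368) completing its initial List and establishing a Watch now that the apiserver answers again; the kubelet keeps one such cache per ConfigMap and Secret it mounts. The same machinery driven through a shared informer factory, as a sketch (assumes a kubeconfig at the default path; the kubelet itself scopes these informers per object rather than cluster-wide):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	factory := informers.NewSharedInformerFactory(cs, 10*time.Minute)
	cmInformer := factory.Core().V1().ConfigMaps().Informer()
	secretInformer := factory.Core().V1().Secrets().Informer()

	factory.Start(ctx.Done())
	// Blocks until the initial List has filled the stores, i.e. the moment
	// the reflector would log "Caches populated".
	if !cache.WaitForCacheSync(ctx.Done(), cmInformer.HasSynced, secretInformer.HasSynced) {
		log.Fatal("caches never populated (apiserver unreachable?)")
	}
	fmt.Println("caches populated for *v1.ConfigMap and *v1.Secret")
}
```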
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.466600 4926 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.467869 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.46785532 podStartE2EDuration="37.46785532s" podCreationTimestamp="2026-03-12 18:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:08:12.933649342 +0000 UTC m=+333.302275685" watchObservedRunningTime="2026-03-12 18:08:30.46785532 +0000 UTC m=+350.836481653" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.470839 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.470885 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.475388 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.487252 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.48723427 podStartE2EDuration="17.48723427s" podCreationTimestamp="2026-03-12 18:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:08:30.48657967 +0000 UTC m=+350.855206013" watchObservedRunningTime="2026-03-12 18:08:30.48723427 +0000 UTC m=+350.855860603" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.526452 4926 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.537614 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.643997 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.664317 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.689498 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.762539 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.811532 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.851614 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.974360 4926 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 18:08:30 crc kubenswrapper[4926]: I0312 18:08:30.977259 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.030194 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.034587 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.167709 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.174611 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.192185 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.250584 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.360673 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.422076 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.450019 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.455790 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.457032 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.479532 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.517215 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.517554 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.524283 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.539944 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.543162 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.550790 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.635554 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.799433 4926 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.799655 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.804160 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.828056 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.837494 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.893392 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 18:08:31 crc kubenswrapper[4926]: I0312 18:08:31.997375 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.146141 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.196667 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.244546 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.305517 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.328090 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.369760 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.381359 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.420240 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.427285 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.463304 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.481939 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.491728 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.513966 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 18:08:32 crc kubenswrapper[4926]: I0312 18:08:32.685760 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.002580 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.013742 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.055415 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.195353 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.281217 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.304348 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.372490 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.385887 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.436363 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.440631 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.511662 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.521293 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.618110 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.620694 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.703763 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.733523 4926 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.753188 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 18:08:33 crc kubenswrapper[4926]: I0312 18:08:33.809154 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.044202 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.083838 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.092484 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.164484 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.281424 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.293076 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.296240 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.319055 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.319425 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.323379 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.329143 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.439902 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.485825 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.549936 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.621992 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.636307 4926 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.670051 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.703149 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.805911 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.807129 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 18:08:34 crc kubenswrapper[4926]: I0312 18:08:34.954996 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.009589 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.081860 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.188185 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.324837 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.354960 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.390831 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.472127 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.517956 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.624854 4926 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.625171 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d" gracePeriod=5 Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.666834 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.762309 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.795678 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" 
Mar 12 18:08:35 crc kubenswrapper[4926]: I0312 18:08:35.981857 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.050596 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.097050 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.159855 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.197471 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.326517 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.345194 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.349687 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.369529 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.436768 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.552250 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.573200 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.659617 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.662739 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.741101 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.746178 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.789587 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.804152 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.826949 4926 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.830731 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.859666 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.944894 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 18:08:36 crc kubenswrapper[4926]: I0312 18:08:36.987417 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.003136 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.083670 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.111822 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.157638 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.404856 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.446706 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.549702 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.578531 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.603929 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.629622 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.744634 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.880364 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.964026 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 18:08:37 crc kubenswrapper[4926]: I0312 18:08:37.988672 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.006176 4926 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.016154 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.089564 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.677662 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.755559 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 18:08:38 crc kubenswrapper[4926]: I0312 18:08:38.886309 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.184782 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.280676 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.378101 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.454958 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.461786 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.863286 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.871193 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.895709 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.898582 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 18:08:39 crc kubenswrapper[4926]: I0312 18:08:39.956542 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.032308 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.175852 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.301498 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 18:08:40 crc 
kubenswrapper[4926]: I0312 18:08:40.447492 4926 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.558360 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.597974 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.686957 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.750323 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.750420 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831775 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831863 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831901 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831927 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831907 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.831946 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832042 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832086 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832358 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832657 4926 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832688 4926 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832700 4926 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.832711 4926 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.844661 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.933599 4926 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940263 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555648-j9n25"] Mar 12 18:08:40 crc kubenswrapper[4926]: E0312 18:08:40.940471 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940486 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 18:08:40 crc kubenswrapper[4926]: E0312 18:08:40.940502 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" containerName="installer" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940508 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" containerName="installer" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940585 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c14bbd-6ba6-42c2-9e94-bf3a6f68500f" containerName="installer" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940596 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.940930 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.944983 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.945123 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.950854 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:08:40 crc kubenswrapper[4926]: I0312 18:08:40.953008 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555648-j9n25"] Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.034767 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wkg7\" (UniqueName: \"kubernetes.io/projected/8dbfae2a-83a6-4586-a33c-b9bcfd4df092-kube-api-access-2wkg7\") pod \"auto-csr-approver-29555648-j9n25\" (UID: \"8dbfae2a-83a6-4586-a33c-b9bcfd4df092\") " pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.135502 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wkg7\" (UniqueName: \"kubernetes.io/projected/8dbfae2a-83a6-4586-a33c-b9bcfd4df092-kube-api-access-2wkg7\") pod \"auto-csr-approver-29555648-j9n25\" (UID: \"8dbfae2a-83a6-4586-a33c-b9bcfd4df092\") " pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.155465 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wkg7\" (UniqueName: \"kubernetes.io/projected/8dbfae2a-83a6-4586-a33c-b9bcfd4df092-kube-api-access-2wkg7\") pod \"auto-csr-approver-29555648-j9n25\" (UID: \"8dbfae2a-83a6-4586-a33c-b9bcfd4df092\") " pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.254246 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.462081 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.462131 4926 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d" exitCode=137 Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.462171 4926 scope.go:117] "RemoveContainer" containerID="3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.462290 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.497069 4926 scope.go:117] "RemoveContainer" containerID="3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d" Mar 12 18:08:41 crc kubenswrapper[4926]: E0312 18:08:41.497944 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d\": container with ID starting with 3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d not found: ID does not exist" containerID="3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.497991 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d"} err="failed to get container status \"3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d\": rpc error: code = NotFound desc = could not find container \"3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d\": container with ID starting with 3fa55b458c434da936089801bf146dd071861b6c00213ac408d53f6b45376d4d not found: ID does not exist" Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.756781 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555648-j9n25"] Mar 12 18:08:41 crc kubenswrapper[4926]: I0312 18:08:41.833125 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.104895 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.468458 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555648-j9n25" event={"ID":"8dbfae2a-83a6-4586-a33c-b9bcfd4df092","Type":"ContainerStarted","Data":"4dd1bd1bd4a15c672854c7bab2a44c18a4e95ed9923be222520bdc3a11e519d2"} Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.496146 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.496458 4926 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.506852 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.506890 4926 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="416312af-d40e-46e7-8345-9f214dc327c3" Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.510523 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 18:08:42 crc kubenswrapper[4926]: I0312 18:08:42.510559 4926 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="416312af-d40e-46e7-8345-9f214dc327c3" Mar 12 18:08:42 crc 
kubenswrapper[4926]: I0312 18:08:42.528774 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 18:08:43 crc kubenswrapper[4926]: I0312 18:08:43.476060 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555648-j9n25" event={"ID":"8dbfae2a-83a6-4586-a33c-b9bcfd4df092","Type":"ContainerStarted","Data":"57bd95669fead78f455d353e126bbd0157addbc0b0171c28ad86c1bdee789263"} Mar 12 18:08:43 crc kubenswrapper[4926]: I0312 18:08:43.486691 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555648-j9n25" podStartSLOduration=2.070084461 podStartE2EDuration="3.486673164s" podCreationTimestamp="2026-03-12 18:08:40 +0000 UTC" firstStartedPulling="2026-03-12 18:08:41.753704053 +0000 UTC m=+362.122330386" lastFinishedPulling="2026-03-12 18:08:43.170292756 +0000 UTC m=+363.538919089" observedRunningTime="2026-03-12 18:08:43.486518339 +0000 UTC m=+363.855144672" watchObservedRunningTime="2026-03-12 18:08:43.486673164 +0000 UTC m=+363.855299497" Mar 12 18:08:44 crc kubenswrapper[4926]: I0312 18:08:44.481580 4926 generic.go:334] "Generic (PLEG): container finished" podID="8dbfae2a-83a6-4586-a33c-b9bcfd4df092" containerID="57bd95669fead78f455d353e126bbd0157addbc0b0171c28ad86c1bdee789263" exitCode=0 Mar 12 18:08:44 crc kubenswrapper[4926]: I0312 18:08:44.481625 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555648-j9n25" event={"ID":"8dbfae2a-83a6-4586-a33c-b9bcfd4df092","Type":"ContainerDied","Data":"57bd95669fead78f455d353e126bbd0157addbc0b0171c28ad86c1bdee789263"} Mar 12 18:08:45 crc kubenswrapper[4926]: I0312 18:08:45.836236 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:08:45 crc kubenswrapper[4926]: I0312 18:08:45.893563 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wkg7\" (UniqueName: \"kubernetes.io/projected/8dbfae2a-83a6-4586-a33c-b9bcfd4df092-kube-api-access-2wkg7\") pod \"8dbfae2a-83a6-4586-a33c-b9bcfd4df092\" (UID: \"8dbfae2a-83a6-4586-a33c-b9bcfd4df092\") " 
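
The "Observed pod startup duration" record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is the same interval minus the image-pull window (firstStartedPulling to lastFinishedPulling), which the startup SLO excludes. Re-deriving the logged numbers as a check, not as kubelet code:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse("2006-01-02 15:04:05.999999999", s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2026-03-12 18:08:40")
        firstPull := parse("2026-03-12 18:08:41.753704053")
        lastPull := parse("2026-03-12 18:08:43.170292756")
        running := parse("2026-03-12 18:08:43.486673164") // watchObservedRunningTime

        e2e := running.Sub(created)          // 3.486673164s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // minus the 1.416588703s pull window = 2.070084461s
        fmt.Println(e2e, slo)
    }
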
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:08:45 crc kubenswrapper[4926]: I0312 18:08:45.994330 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wkg7\" (UniqueName: \"kubernetes.io/projected/8dbfae2a-83a6-4586-a33c-b9bcfd4df092-kube-api-access-2wkg7\") on node \"crc\" DevicePath \"\"" Mar 12 18:08:46 crc kubenswrapper[4926]: I0312 18:08:46.498020 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555648-j9n25" event={"ID":"8dbfae2a-83a6-4586-a33c-b9bcfd4df092","Type":"ContainerDied","Data":"4dd1bd1bd4a15c672854c7bab2a44c18a4e95ed9923be222520bdc3a11e519d2"} Mar 12 18:08:46 crc kubenswrapper[4926]: I0312 18:08:46.498105 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dd1bd1bd4a15c672854c7bab2a44c18a4e95ed9923be222520bdc3a11e519d2" Mar 12 18:08:46 crc kubenswrapper[4926]: I0312 18:08:46.498209 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555648-j9n25" Mar 12 18:09:00 crc kubenswrapper[4926]: I0312 18:09:00.590978 4926 generic.go:334] "Generic (PLEG): container finished" podID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerID="c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec" exitCode=0 Mar 12 18:09:00 crc kubenswrapper[4926]: I0312 18:09:00.591172 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerDied","Data":"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec"} Mar 12 18:09:00 crc kubenswrapper[4926]: I0312 18:09:00.592628 4926 scope.go:117] "RemoveContainer" containerID="c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec" Mar 12 18:09:01 crc kubenswrapper[4926]: I0312 18:09:01.602865 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerStarted","Data":"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91"} Mar 12 18:09:01 crc kubenswrapper[4926]: I0312 18:09:01.603294 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:09:01 crc kubenswrapper[4926]: I0312 18:09:01.605690 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:09:56 crc kubenswrapper[4926]: I0312 18:09:56.817733 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:09:56 crc kubenswrapper[4926]: I0312 18:09:56.818740 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.780324 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4xt9k"] Mar 12 18:09:59 crc kubenswrapper[4926]: E0312 
18:09:59.780651 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dbfae2a-83a6-4586-a33c-b9bcfd4df092" containerName="oc" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.780670 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbfae2a-83a6-4586-a33c-b9bcfd4df092" containerName="oc" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.780839 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dbfae2a-83a6-4586-a33c-b9bcfd4df092" containerName="oc" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.781357 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.804242 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4xt9k"] Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962146 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-trusted-ca\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962205 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61de267b-cc52-4384-a66f-e8c2f5ddac13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962323 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-certificates\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962365 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-tls\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962404 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9dg\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-kube-api-access-bx9dg\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962455 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-bound-sa-token\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 
18:09:59.962531 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.962619 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61de267b-cc52-4384-a66f-e8c2f5ddac13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:09:59 crc kubenswrapper[4926]: I0312 18:09:59.986320 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063412 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-tls\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063697 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-certificates\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063727 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9dg\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-kube-api-access-bx9dg\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063747 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-bound-sa-token\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063788 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61de267b-cc52-4384-a66f-e8c2f5ddac13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063822 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-trusted-ca\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.063852 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61de267b-cc52-4384-a66f-e8c2f5ddac13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.064432 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61de267b-cc52-4384-a66f-e8c2f5ddac13-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.064977 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-trusted-ca\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.065017 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-certificates\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.068745 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-registry-tls\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.069610 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61de267b-cc52-4384-a66f-e8c2f5ddac13-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.088688 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-bound-sa-token\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.088727 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9dg\" (UniqueName: \"kubernetes.io/projected/61de267b-cc52-4384-a66f-e8c2f5ddac13-kube-api-access-bx9dg\") pod \"image-registry-66df7c8f76-4xt9k\" (UID: \"61de267b-cc52-4384-a66f-e8c2f5ddac13\") " pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc 
kubenswrapper[4926]: I0312 18:10:00.104690 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.127811 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555650-t22sq"] Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.128572 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.130667 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.130922 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.131015 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.142026 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555650-t22sq"] Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.266048 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgl86\" (UniqueName: \"kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86\") pod \"auto-csr-approver-29555650-t22sq\" (UID: \"5a5ff64f-2478-4592-8a08-fb47a40a8de5\") " pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.286381 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4xt9k"] Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.367219 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgl86\" (UniqueName: \"kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86\") pod \"auto-csr-approver-29555650-t22sq\" (UID: \"5a5ff64f-2478-4592-8a08-fb47a40a8de5\") " pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.389122 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgl86\" (UniqueName: \"kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86\") pod \"auto-csr-approver-29555650-t22sq\" (UID: \"5a5ff64f-2478-4592-8a08-fb47a40a8de5\") " pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.465053 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.626055 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555650-t22sq"] Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.979589 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" event={"ID":"61de267b-cc52-4384-a66f-e8c2f5ddac13","Type":"ContainerStarted","Data":"a3ae85736cc481a99989d3e010a953e8a8bbf4438c14860f036e5e3ad93aaaa7"} Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.979976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" event={"ID":"61de267b-cc52-4384-a66f-e8c2f5ddac13","Type":"ContainerStarted","Data":"3d9227b85d6bab68548e3175d5cc29dfcb8aacc3002008bb43918526dfec9c68"} Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.980006 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.980920 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555650-t22sq" event={"ID":"5a5ff64f-2478-4592-8a08-fb47a40a8de5","Type":"ContainerStarted","Data":"a7f386311505d74c38a0b78f7af734c7c38cca55a7266dfa47d2726163b7e6ff"} Mar 12 18:10:00 crc kubenswrapper[4926]: I0312 18:10:00.997289 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" podStartSLOduration=1.997270171 podStartE2EDuration="1.997270171s" podCreationTimestamp="2026-03-12 18:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:10:00.99722775 +0000 UTC m=+441.365854113" watchObservedRunningTime="2026-03-12 18:10:00.997270171 +0000 UTC m=+441.365896504" Mar 12 18:10:03 crc kubenswrapper[4926]: I0312 18:10:03.000049 4926 generic.go:334] "Generic (PLEG): container finished" podID="5a5ff64f-2478-4592-8a08-fb47a40a8de5" containerID="1d8805e76417daef23d168ee9dafcbdf012b728e6e9278f288005edef5f23d20" exitCode=0 Mar 12 18:10:03 crc kubenswrapper[4926]: I0312 18:10:03.000523 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555650-t22sq" event={"ID":"5a5ff64f-2478-4592-8a08-fb47a40a8de5","Type":"ContainerDied","Data":"1d8805e76417daef23d168ee9dafcbdf012b728e6e9278f288005edef5f23d20"} Mar 12 18:10:04 crc kubenswrapper[4926]: I0312 18:10:04.365604 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:04 crc kubenswrapper[4926]: I0312 18:10:04.532571 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgl86\" (UniqueName: \"kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86\") pod \"5a5ff64f-2478-4592-8a08-fb47a40a8de5\" (UID: \"5a5ff64f-2478-4592-8a08-fb47a40a8de5\") " Mar 12 18:10:04 crc kubenswrapper[4926]: I0312 18:10:04.538978 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86" (OuterVolumeSpecName: "kube-api-access-cgl86") pod "5a5ff64f-2478-4592-8a08-fb47a40a8de5" (UID: "5a5ff64f-2478-4592-8a08-fb47a40a8de5"). 
InnerVolumeSpecName "kube-api-access-cgl86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:04 crc kubenswrapper[4926]: I0312 18:10:04.633722 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgl86\" (UniqueName: \"kubernetes.io/projected/5a5ff64f-2478-4592-8a08-fb47a40a8de5-kube-api-access-cgl86\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:05 crc kubenswrapper[4926]: I0312 18:10:05.020650 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555650-t22sq" event={"ID":"5a5ff64f-2478-4592-8a08-fb47a40a8de5","Type":"ContainerDied","Data":"a7f386311505d74c38a0b78f7af734c7c38cca55a7266dfa47d2726163b7e6ff"} Mar 12 18:10:05 crc kubenswrapper[4926]: I0312 18:10:05.021110 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f386311505d74c38a0b78f7af734c7c38cca55a7266dfa47d2726163b7e6ff" Mar 12 18:10:05 crc kubenswrapper[4926]: I0312 18:10:05.020735 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555650-t22sq" Mar 12 18:10:20 crc kubenswrapper[4926]: I0312 18:10:20.109162 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4xt9k" Mar 12 18:10:20 crc kubenswrapper[4926]: I0312 18:10:20.199855 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"] Mar 12 18:10:26 crc kubenswrapper[4926]: I0312 18:10:26.817641 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:10:26 crc kubenswrapper[4926]: I0312 18:10:26.818306 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.660055 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-565fl"] Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.660653 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-565fl" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="registry-server" containerID="cri-o://a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8" gracePeriod=30 Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.672547 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4zf5"] Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.674158 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s4zf5" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="registry-server" containerID="cri-o://1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e" gracePeriod=30 Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.684001 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"] Mar 12 18:10:28 crc 
kubenswrapper[4926]: I0312 18:10:28.684275 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator" containerID="cri-o://356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91" gracePeriod=30 Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.692342 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"] Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.692568 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wz6qt" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="registry-server" containerID="cri-o://9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83" gracePeriod=30 Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.698882 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d9gx5"] Mar 12 18:10:28 crc kubenswrapper[4926]: E0312 18:10:28.699179 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5ff64f-2478-4592-8a08-fb47a40a8de5" containerName="oc" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.699199 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5ff64f-2478-4592-8a08-fb47a40a8de5" containerName="oc" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.699334 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5ff64f-2478-4592-8a08-fb47a40a8de5" containerName="oc" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.699826 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.702097 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.702304 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zx9h7" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="registry-server" containerID="cri-o://2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5" gracePeriod=30 Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.711748 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.711810 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.711882 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzgn\" (UniqueName: 
\"kubernetes.io/projected/daeebaaf-6a69-436e-b341-36fae756599e-kube-api-access-tmzgn\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.714574 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d9gx5"] Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.812595 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.813038 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.813186 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzgn\" (UniqueName: \"kubernetes.io/projected/daeebaaf-6a69-436e-b341-36fae756599e-kube-api-access-tmzgn\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.814511 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.821805 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/daeebaaf-6a69-436e-b341-36fae756599e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:28 crc kubenswrapper[4926]: I0312 18:10:28.839533 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzgn\" (UniqueName: \"kubernetes.io/projected/daeebaaf-6a69-436e-b341-36fae756599e-kube-api-access-tmzgn\") pod \"marketplace-operator-79b997595-d9gx5\" (UID: \"daeebaaf-6a69-436e-b341-36fae756599e\") " pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.031546 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.078288 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.116170 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.148929 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.153341 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.154641 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.192647 4926 generic.go:334] "Generic (PLEG): container finished" podID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerID="9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83" exitCode=0 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.192985 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerDied","Data":"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.193009 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz6qt" event={"ID":"702daa2d-851e-4c3d-be86-4f337b4462f7","Type":"ContainerDied","Data":"1ca7001bbffd391adca6fddca7abaa8f8e5f4a6dec4a426455ef61c2d31fac2d"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.193026 4926 scope.go:117] "RemoveContainer" containerID="9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.193126 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz6qt" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.198948 4926 generic.go:334] "Generic (PLEG): container finished" podID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerID="356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91" exitCode=0 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.199019 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerDied","Data":"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.199047 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" event={"ID":"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b","Type":"ContainerDied","Data":"7a6f3ba59fc7e8583a370a6e58f7ad7026a2677d263e98044721d21503f15589"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.199096 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-c68kr" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.208218 4926 generic.go:334] "Generic (PLEG): container finished" podID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerID="a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8" exitCode=0 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.208277 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerDied","Data":"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.208301 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-565fl" event={"ID":"b5fe4032-6a1e-4c27-9471-fa53e044826e","Type":"ContainerDied","Data":"2ea4f4bc9037632e2b998da8312ddbb98d227eabc09259bd968b95eab6d3b562"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.208350 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-565fl" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.212081 4926 generic.go:334] "Generic (PLEG): container finished" podID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerID="1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e" exitCode=0 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.212134 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4zf5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.212133 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerDied","Data":"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.212177 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4zf5" event={"ID":"b2a609cd-c298-4356-9ddf-a7f125b52938","Type":"ContainerDied","Data":"6da98836531143c0f0c9e5b0f54f53b5a8c52f32a98b1d55d2cd5e57c0ec7c9e"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.216097 4926 generic.go:334] "Generic (PLEG): container finished" podID="637236a6-6287-401d-a2cd-78713aa03176" containerID="2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5" exitCode=0 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.216144 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerDied","Data":"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.216196 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zx9h7" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.216207 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx9h7" event={"ID":"637236a6-6287-401d-a2cd-78713aa03176","Type":"ContainerDied","Data":"b41dbdfd44a141e983428f384fe92d330459deb4bd26d614487e897aef5edb00"} Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.218081 4926 scope.go:117] "RemoveContainer" containerID="946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226094 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content\") pod \"b5fe4032-6a1e-4c27-9471-fa53e044826e\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226144 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content\") pod \"b2a609cd-c298-4356-9ddf-a7f125b52938\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226179 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf7tf\" (UniqueName: \"kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf\") pod \"b2a609cd-c298-4356-9ddf-a7f125b52938\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226205 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities\") pod \"b2a609cd-c298-4356-9ddf-a7f125b52938\" (UID: \"b2a609cd-c298-4356-9ddf-a7f125b52938\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226279 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities\") pod \"b5fe4032-6a1e-4c27-9471-fa53e044826e\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.226302 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wxfn\" (UniqueName: \"kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn\") pod \"b5fe4032-6a1e-4c27-9471-fa53e044826e\" (UID: \"b5fe4032-6a1e-4c27-9471-fa53e044826e\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.229633 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities" (OuterVolumeSpecName: "utilities") pod "b2a609cd-c298-4356-9ddf-a7f125b52938" (UID: "b2a609cd-c298-4356-9ddf-a7f125b52938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.229635 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities" (OuterVolumeSpecName: "utilities") pod "b5fe4032-6a1e-4c27-9471-fa53e044826e" (UID: "b5fe4032-6a1e-4c27-9471-fa53e044826e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.234422 4926 scope.go:117] "RemoveContainer" containerID="7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.238491 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf" (OuterVolumeSpecName: "kube-api-access-wf7tf") pod "b2a609cd-c298-4356-9ddf-a7f125b52938" (UID: "b2a609cd-c298-4356-9ddf-a7f125b52938"). InnerVolumeSpecName "kube-api-access-wf7tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.244632 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn" (OuterVolumeSpecName: "kube-api-access-2wxfn") pod "b5fe4032-6a1e-4c27-9471-fa53e044826e" (UID: "b5fe4032-6a1e-4c27-9471-fa53e044826e"). InnerVolumeSpecName "kube-api-access-2wxfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.253645 4926 scope.go:117] "RemoveContainer" containerID="9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.254671 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83\": container with ID starting with 9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83 not found: ID does not exist" containerID="9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.254720 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83"} err="failed to get container status \"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83\": rpc error: code = NotFound desc = could not find container \"9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83\": container with ID starting with 9865a7f38f113c6dde3c65a4e033ac09e71fbbe0591d4c816afba1cdad51af83 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.254778 4926 scope.go:117] "RemoveContainer" containerID="946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.256667 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b\": container with ID starting with 946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b not found: ID does not exist" containerID="946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.256708 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b"} err="failed to get container status \"946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b\": rpc error: code = NotFound desc = could not find container \"946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b\": container with ID starting with 
946cba603efe15d0422595010c1330546d61cfb532c27ac70fff29c0e4b7670b not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.256733 4926 scope.go:117] "RemoveContainer" containerID="7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.257014 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa\": container with ID starting with 7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa not found: ID does not exist" containerID="7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.257047 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa"} err="failed to get container status \"7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa\": rpc error: code = NotFound desc = could not find container \"7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa\": container with ID starting with 7dc08056a58cbc293791a80a85351d04a9b2abbd923938014309511e4c4926aa not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.257067 4926 scope.go:117] "RemoveContainer" containerID="356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.268043 4926 scope.go:117] "RemoveContainer" containerID="c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.281475 4926 scope.go:117] "RemoveContainer" containerID="356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.281834 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91\": container with ID starting with 356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91 not found: ID does not exist" containerID="356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.281860 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91"} err="failed to get container status \"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91\": rpc error: code = NotFound desc = could not find container \"356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91\": container with ID starting with 356f40aae500e05628cc7650828bff417d73534ec79c5865296b52de2f45db91 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.281877 4926 scope.go:117] "RemoveContainer" containerID="c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.282049 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec\": container with ID starting with c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec not found: ID does not exist" 
containerID="c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.282065 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec"} err="failed to get container status \"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec\": rpc error: code = NotFound desc = could not find container \"c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec\": container with ID starting with c343107391a1e08d133d97a7a02674714d51675df1d543bc4b787c1e00fb0bec not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.282078 4926 scope.go:117] "RemoveContainer" containerID="a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.288121 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2a609cd-c298-4356-9ddf-a7f125b52938" (UID: "b2a609cd-c298-4356-9ddf-a7f125b52938"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.289166 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5fe4032-6a1e-4c27-9471-fa53e044826e" (UID: "b5fe4032-6a1e-4c27-9471-fa53e044826e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.301283 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-d9gx5"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.303493 4926 scope.go:117] "RemoveContainer" containerID="d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5" Mar 12 18:10:29 crc kubenswrapper[4926]: W0312 18:10:29.316857 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaeebaaf_6a69_436e_b341_36fae756599e.slice/crio-cb5bce34c6486bdd5b618e103d0c85368e73bb247bb05dc441733b928f66d9e4 WatchSource:0}: Error finding container cb5bce34c6486bdd5b618e103d0c85368e73bb247bb05dc441733b928f66d9e4: Status 404 returned error can't find the container with id cb5bce34c6486bdd5b618e103d0c85368e73bb247bb05dc441733b928f66d9e4 Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.320635 4926 scope.go:117] "RemoveContainer" containerID="fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327243 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdq7x\" (UniqueName: \"kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x\") pod \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327304 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities\") pod \"702daa2d-851e-4c3d-be86-4f337b4462f7\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " Mar 12 18:10:29 crc 
kubenswrapper[4926]: I0312 18:10:29.327330 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content\") pod \"702daa2d-851e-4c3d-be86-4f337b4462f7\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327390 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca\") pod \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327409 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57ncv\" (UniqueName: \"kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv\") pod \"637236a6-6287-401d-a2cd-78713aa03176\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327495 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities\") pod \"637236a6-6287-401d-a2cd-78713aa03176\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327513 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics\") pod \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\" (UID: \"16fff1b6-a4ca-4ea9-aaba-40e9c136f62b\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327570 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content\") pod \"637236a6-6287-401d-a2cd-78713aa03176\" (UID: \"637236a6-6287-401d-a2cd-78713aa03176\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.327625 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwlr2\" (UniqueName: \"kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2\") pod \"702daa2d-851e-4c3d-be86-4f337b4462f7\" (UID: \"702daa2d-851e-4c3d-be86-4f337b4462f7\") " Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328267 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" (UID: "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328728 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328767 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328777 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2a609cd-c298-4356-9ddf-a7f125b52938-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328786 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf7tf\" (UniqueName: \"kubernetes.io/projected/b2a609cd-c298-4356-9ddf-a7f125b52938-kube-api-access-wf7tf\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328796 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5fe4032-6a1e-4c27-9471-fa53e044826e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328805 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wxfn\" (UniqueName: \"kubernetes.io/projected/b5fe4032-6a1e-4c27-9471-fa53e044826e-kube-api-access-2wxfn\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.328835 4926 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.329303 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities" (OuterVolumeSpecName: "utilities") pod "637236a6-6287-401d-a2cd-78713aa03176" (UID: "637236a6-6287-401d-a2cd-78713aa03176"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.330077 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" (UID: "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.330527 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x" (OuterVolumeSpecName: "kube-api-access-vdq7x") pod "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" (UID: "16fff1b6-a4ca-4ea9-aaba-40e9c136f62b"). InnerVolumeSpecName "kube-api-access-vdq7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.331388 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2" (OuterVolumeSpecName: "kube-api-access-nwlr2") pod "702daa2d-851e-4c3d-be86-4f337b4462f7" (UID: "702daa2d-851e-4c3d-be86-4f337b4462f7"). InnerVolumeSpecName "kube-api-access-nwlr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.332267 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities" (OuterVolumeSpecName: "utilities") pod "702daa2d-851e-4c3d-be86-4f337b4462f7" (UID: "702daa2d-851e-4c3d-be86-4f337b4462f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.340505 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv" (OuterVolumeSpecName: "kube-api-access-57ncv") pod "637236a6-6287-401d-a2cd-78713aa03176" (UID: "637236a6-6287-401d-a2cd-78713aa03176"). InnerVolumeSpecName "kube-api-access-57ncv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.355272 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "702daa2d-851e-4c3d-be86-4f337b4462f7" (UID: "702daa2d-851e-4c3d-be86-4f337b4462f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.357411 4926 scope.go:117] "RemoveContainer" containerID="a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.357828 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8\": container with ID starting with a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8 not found: ID does not exist" containerID="a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.357870 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8"} err="failed to get container status \"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8\": rpc error: code = NotFound desc = could not find container \"a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8\": container with ID starting with a7cffeb42c14a337a70a12925c231b7706bf9cf89e483b52294bc445995635e8 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.357900 4926 scope.go:117] "RemoveContainer" containerID="d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.358245 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5\": container with ID starting with d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5 not found: ID does not exist" containerID="d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.358291 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5"} err="failed to get container status \"d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5\": rpc error: code = NotFound desc = could not find container \"d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5\": container with ID starting with d0ec30d7572ff20ab50a53e18ca39227cc44a4942924ccee935f4244f66215a5 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.358304 4926 scope.go:117] "RemoveContainer" containerID="fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.358735 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8\": container with ID starting with fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8 not found: ID does not exist" containerID="fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.358763 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8"} err="failed to get container status \"fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8\": rpc error: code = NotFound desc = could not 
find container \"fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8\": container with ID starting with fb58a38d637092f295d67153f35c6b3f40a27bae368a932eb26c3f8584a612c8 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.358781 4926 scope.go:117] "RemoveContainer" containerID="1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.375416 4926 scope.go:117] "RemoveContainer" containerID="f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.389905 4926 scope.go:117] "RemoveContainer" containerID="12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.402607 4926 scope.go:117] "RemoveContainer" containerID="1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.403245 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e\": container with ID starting with 1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e not found: ID does not exist" containerID="1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.403339 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e"} err="failed to get container status \"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e\": rpc error: code = NotFound desc = could not find container \"1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e\": container with ID starting with 1e7c10918fb2662f0c3c2d62f7e215a766fbc3071c1ee00eafa305241344c76e not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.403421 4926 scope.go:117] "RemoveContainer" containerID="f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.403808 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc\": container with ID starting with f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc not found: ID does not exist" containerID="f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.403916 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc"} err="failed to get container status \"f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc\": rpc error: code = NotFound desc = could not find container \"f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc\": container with ID starting with f38216f8913635b4032c7273cdec95f688341e0baf732831b6a09cf20c9441dc not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.403981 4926 scope.go:117] "RemoveContainer" containerID="12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.404253 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447\": container with ID starting with 12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447 not found: ID does not exist" containerID="12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.404327 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447"} err="failed to get container status \"12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447\": rpc error: code = NotFound desc = could not find container \"12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447\": container with ID starting with 12072b6b2001c513ccb9e0bfcc29df831e119de1d1ab215530bbc5c9682cd447 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.404400 4926 scope.go:117] "RemoveContainer" containerID="2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.420041 4926 scope.go:117] "RemoveContainer" containerID="599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430120 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430153 4926 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430164 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwlr2\" (UniqueName: \"kubernetes.io/projected/702daa2d-851e-4c3d-be86-4f337b4462f7-kube-api-access-nwlr2\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430174 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdq7x\" (UniqueName: \"kubernetes.io/projected/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b-kube-api-access-vdq7x\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430183 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430192 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/702daa2d-851e-4c3d-be86-4f337b4462f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.430200 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57ncv\" (UniqueName: \"kubernetes.io/projected/637236a6-6287-401d-a2cd-78713aa03176-kube-api-access-57ncv\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.443705 4926 scope.go:117] "RemoveContainer" containerID="76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.460133 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "637236a6-6287-401d-a2cd-78713aa03176" (UID: "637236a6-6287-401d-a2cd-78713aa03176"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.468230 4926 scope.go:117] "RemoveContainer" containerID="2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.469625 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5\": container with ID starting with 2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5 not found: ID does not exist" containerID="2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.469663 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5"} err="failed to get container status \"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5\": rpc error: code = NotFound desc = could not find container \"2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5\": container with ID starting with 2c8c0fa29e4cec7c2b5e010a9e7a63d04b2f8d113b9bc5720d7a9f2830e056e5 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.469688 4926 scope.go:117] "RemoveContainer" containerID="599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.470304 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb\": container with ID starting with 599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb not found: ID does not exist" containerID="599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.470332 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb"} err="failed to get container status \"599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb\": rpc error: code = NotFound desc = could not find container \"599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb\": container with ID starting with 599c070b0f1e1e00eef408d7433fad843523b69117524e7dec85b71cf79b99cb not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.470352 4926 scope.go:117] "RemoveContainer" containerID="76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1" Mar 12 18:10:29 crc kubenswrapper[4926]: E0312 18:10:29.470693 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1\": container with ID starting with 76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1 not found: ID does not exist" containerID="76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.470714 4926 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1"} err="failed to get container status \"76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1\": rpc error: code = NotFound desc = could not find container \"76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1\": container with ID starting with 76aab1cc6cdb175ca5208c74f8c9cf48afd4b5c01686d1d1f1fa4ff001aa81c1 not found: ID does not exist" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.530705 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.531668 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/637236a6-6287-401d-a2cd-78713aa03176-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.536155 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz6qt"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.549771 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.554694 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-c68kr"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.559407 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.567107 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zx9h7"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.570590 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4zf5"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.582771 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s4zf5"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.586874 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-565fl"] Mar 12 18:10:29 crc kubenswrapper[4926]: I0312 18:10:29.589839 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-565fl"] Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.226249 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" event={"ID":"daeebaaf-6a69-436e-b341-36fae756599e","Type":"ContainerStarted","Data":"e8e29782de63bdb9556d9ad371acf39d64a6ad349c1167e79b44e09709cbf79a"} Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.226602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" event={"ID":"daeebaaf-6a69-436e-b341-36fae756599e","Type":"ContainerStarted","Data":"cb5bce34c6486bdd5b618e103d0c85368e73bb247bb05dc441733b928f66d9e4"} Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.227638 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.232068 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" Mar 12 
18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.249922 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-d9gx5" podStartSLOduration=2.249908206 podStartE2EDuration="2.249908206s" podCreationTimestamp="2026-03-12 18:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:10:30.24779349 +0000 UTC m=+470.616419833" watchObservedRunningTime="2026-03-12 18:10:30.249908206 +0000 UTC m=+470.618534539" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481042 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-265z5"] Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481214 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481227 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481236 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481242 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481248 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="registry-server" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481254 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="registry-server" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481264 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="extract-utilities" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481269 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="extract-utilities" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481278 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481283 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481292 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481298 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="extract-content" Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481307 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="extract-utilities" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481313 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="extract-utilities" Mar 12 
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481325 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481331 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481337 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481343 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481352 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="extract-content"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481370 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="extract-content"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481378 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481384 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481392 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481397 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481405 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="extract-utilities"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481411 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="extract-utilities"
Mar 12 18:10:30 crc kubenswrapper[4926]: E0312 18:10:30.481420 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="extract-utilities"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481448 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="extract-utilities"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481552 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481561 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481569 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" containerName="registry-server"
Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481579 4926 memory_manager.go:354] "RemoveStaleState removing state"
podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" containerName="registry-server" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481587 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="637236a6-6287-401d-a2cd-78713aa03176" containerName="registry-server" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.481723 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" containerName="marketplace-operator" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.482189 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.493710 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.536140 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fff1b6-a4ca-4ea9-aaba-40e9c136f62b" path="/var/lib/kubelet/pods/16fff1b6-a4ca-4ea9-aaba-40e9c136f62b/volumes" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.537859 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637236a6-6287-401d-a2cd-78713aa03176" path="/var/lib/kubelet/pods/637236a6-6287-401d-a2cd-78713aa03176/volumes" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.539368 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702daa2d-851e-4c3d-be86-4f337b4462f7" path="/var/lib/kubelet/pods/702daa2d-851e-4c3d-be86-4f337b4462f7/volumes" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.541751 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a609cd-c298-4356-9ddf-a7f125b52938" path="/var/lib/kubelet/pods/b2a609cd-c298-4356-9ddf-a7f125b52938/volumes" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.543625 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5fe4032-6a1e-4c27-9471-fa53e044826e" path="/var/lib/kubelet/pods/b5fe4032-6a1e-4c27-9471-fa53e044826e/volumes" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.545382 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-265z5"] Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.646062 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-utilities\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.646226 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx687\" (UniqueName: \"kubernetes.io/projected/0f746e64-bce3-4f58-b789-0f5573e28847-kube-api-access-xx687\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.646345 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-catalog-content\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 
crc kubenswrapper[4926]: I0312 18:10:30.746976 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-catalog-content\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.747347 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-utilities\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.747513 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx687\" (UniqueName: \"kubernetes.io/projected/0f746e64-bce3-4f58-b789-0f5573e28847-kube-api-access-xx687\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.747525 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-catalog-content\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.748056 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f746e64-bce3-4f58-b789-0f5573e28847-utilities\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.775104 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx687\" (UniqueName: \"kubernetes.io/projected/0f746e64-bce3-4f58-b789-0f5573e28847-kube-api-access-xx687\") pod \"redhat-marketplace-265z5\" (UID: \"0f746e64-bce3-4f58-b789-0f5573e28847\") " pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:30 crc kubenswrapper[4926]: I0312 18:10:30.818027 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-265z5" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.008273 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-265z5"] Mar 12 18:10:31 crc kubenswrapper[4926]: W0312 18:10:31.016027 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f746e64_bce3_4f58_b789_0f5573e28847.slice/crio-27b0866546d967bb33c43963d5bf82a711ca8438fc5ee042557732aa8757e0a9 WatchSource:0}: Error finding container 27b0866546d967bb33c43963d5bf82a711ca8438fc5ee042557732aa8757e0a9: Status 404 returned error can't find the container with id 27b0866546d967bb33c43963d5bf82a711ca8438fc5ee042557732aa8757e0a9 Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.241858 4926 generic.go:334] "Generic (PLEG): container finished" podID="0f746e64-bce3-4f58-b789-0f5573e28847" containerID="527b0f3339ffe4707231354613a46abcba6e7c41812696c2338da4db5088147d" exitCode=0 Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.242091 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-265z5" event={"ID":"0f746e64-bce3-4f58-b789-0f5573e28847","Type":"ContainerDied","Data":"527b0f3339ffe4707231354613a46abcba6e7c41812696c2338da4db5088147d"} Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.242151 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-265z5" event={"ID":"0f746e64-bce3-4f58-b789-0f5573e28847","Type":"ContainerStarted","Data":"27b0866546d967bb33c43963d5bf82a711ca8438fc5ee042557732aa8757e0a9"} Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.479261 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-672mz"] Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.480952 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.483523 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.494506 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672mz"] Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.661290 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-utilities\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.661360 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6nj\" (UniqueName: \"kubernetes.io/projected/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-kube-api-access-ck6nj\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.661501 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-catalog-content\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.763185 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-utilities\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.763241 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6nj\" (UniqueName: \"kubernetes.io/projected/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-kube-api-access-ck6nj\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.763304 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-catalog-content\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.763787 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-catalog-content\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.763941 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-utilities\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " 
pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.786869 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6nj\" (UniqueName: \"kubernetes.io/projected/c9f79c44-e93f-48ba-9f2d-8a5b61a86089-kube-api-access-ck6nj\") pod \"redhat-operators-672mz\" (UID: \"c9f79c44-e93f-48ba-9f2d-8a5b61a86089\") " pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:31 crc kubenswrapper[4926]: I0312 18:10:31.851824 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-672mz" Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.088881 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-672mz"] Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.248072 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerStarted","Data":"9d6815757925c31c05247e630d2a9254a2ea04f08098a0932fc5e7a8816ffd88"} Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.248138 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerStarted","Data":"aa3a7f181e123af1866cec91223954325e07acf8dcad47201ae58efcffb9e372"} Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.889012 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8lwn"] Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.894141 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.900926 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 18:10:32 crc kubenswrapper[4926]: I0312 18:10:32.902535 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8lwn"] Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.027626 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-catalog-content\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.027687 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-utilities\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.027727 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f285p\" (UniqueName: \"kubernetes.io/projected/0ef972e3-81c7-4b62-aa07-4939aef86a2d-kube-api-access-f285p\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.128714 4926 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-utilities\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.129152 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f285p\" (UniqueName: \"kubernetes.io/projected/0ef972e3-81c7-4b62-aa07-4939aef86a2d-kube-api-access-f285p\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.129380 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-catalog-content\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.129509 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-utilities\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.129943 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ef972e3-81c7-4b62-aa07-4939aef86a2d-catalog-content\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.149693 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f285p\" (UniqueName: \"kubernetes.io/projected/0ef972e3-81c7-4b62-aa07-4939aef86a2d-kube-api-access-f285p\") pod \"certified-operators-t8lwn\" (UID: \"0ef972e3-81c7-4b62-aa07-4939aef86a2d\") " pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.226157 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.258131 4926 generic.go:334] "Generic (PLEG): container finished" podID="c9f79c44-e93f-48ba-9f2d-8a5b61a86089" containerID="9d6815757925c31c05247e630d2a9254a2ea04f08098a0932fc5e7a8816ffd88" exitCode=0 Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.258420 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerDied","Data":"9d6815757925c31c05247e630d2a9254a2ea04f08098a0932fc5e7a8816ffd88"} Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.264083 4926 generic.go:334] "Generic (PLEG): container finished" podID="0f746e64-bce3-4f58-b789-0f5573e28847" containerID="0508d641f408a89ec3f17b2ac05f9ede131af67d070a5d4c8ba77d8f2743a6d5" exitCode=0 Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.264147 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-265z5" event={"ID":"0f746e64-bce3-4f58-b789-0f5573e28847","Type":"ContainerDied","Data":"0508d641f408a89ec3f17b2ac05f9ede131af67d070a5d4c8ba77d8f2743a6d5"} Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.456748 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8lwn"] Mar 12 18:10:33 crc kubenswrapper[4926]: W0312 18:10:33.461214 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef972e3_81c7_4b62_aa07_4939aef86a2d.slice/crio-fd2691aae6521c45040f9344f6d91ddad90e20cfd5d586ddeb2c5821d1336f0a WatchSource:0}: Error finding container fd2691aae6521c45040f9344f6d91ddad90e20cfd5d586ddeb2c5821d1336f0a: Status 404 returned error can't find the container with id fd2691aae6521c45040f9344f6d91ddad90e20cfd5d586ddeb2c5821d1336f0a Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.881006 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.883559 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.885702 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.905379 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.952369 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.952485 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:33 crc kubenswrapper[4926]: I0312 18:10:33.952577 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5zfs\" (UniqueName: \"kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.053484 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.053538 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.053590 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5zfs\" (UniqueName: \"kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.054004 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.054101 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content\") pod \"community-operators-89rth\" (UID: 
\"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.084665 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5zfs\" (UniqueName: \"kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs\") pod \"community-operators-89rth\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.227514 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.271639 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerStarted","Data":"7b03cdeb0a691f7942895be6b14554fa0f8ec4142145648b27f87e5596fa1020"} Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.273492 4926 generic.go:334] "Generic (PLEG): container finished" podID="0ef972e3-81c7-4b62-aa07-4939aef86a2d" containerID="cde3e1686aebfaaac97245815c4010a7e06082d4c42b0de785931985077be174" exitCode=0 Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.273651 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8lwn" event={"ID":"0ef972e3-81c7-4b62-aa07-4939aef86a2d","Type":"ContainerDied","Data":"cde3e1686aebfaaac97245815c4010a7e06082d4c42b0de785931985077be174"} Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.273969 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8lwn" event={"ID":"0ef972e3-81c7-4b62-aa07-4939aef86a2d","Type":"ContainerStarted","Data":"fd2691aae6521c45040f9344f6d91ddad90e20cfd5d586ddeb2c5821d1336f0a"} Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.276980 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-265z5" event={"ID":"0f746e64-bce3-4f58-b789-0f5573e28847","Type":"ContainerStarted","Data":"ed3558d1c2fe7aee056d002e16a5435263061c33364b778a5e945656d0caf11f"} Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.331627 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-265z5" podStartSLOduration=1.629136209 podStartE2EDuration="4.33160855s" podCreationTimestamp="2026-03-12 18:10:30 +0000 UTC" firstStartedPulling="2026-03-12 18:10:31.244385587 +0000 UTC m=+471.613011920" lastFinishedPulling="2026-03-12 18:10:33.946857918 +0000 UTC m=+474.315484261" observedRunningTime="2026-03-12 18:10:34.328866325 +0000 UTC m=+474.697492678" watchObservedRunningTime="2026-03-12 18:10:34.33160855 +0000 UTC m=+474.700234883" Mar 12 18:10:34 crc kubenswrapper[4926]: I0312 18:10:34.426790 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:10:35 crc kubenswrapper[4926]: I0312 18:10:35.292096 4926 generic.go:334] "Generic (PLEG): container finished" podID="c9f79c44-e93f-48ba-9f2d-8a5b61a86089" containerID="7b03cdeb0a691f7942895be6b14554fa0f8ec4142145648b27f87e5596fa1020" exitCode=0 Mar 12 18:10:35 crc kubenswrapper[4926]: I0312 18:10:35.292265 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" 
event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerDied","Data":"7b03cdeb0a691f7942895be6b14554fa0f8ec4142145648b27f87e5596fa1020"} Mar 12 18:10:35 crc kubenswrapper[4926]: I0312 18:10:35.310197 4926 generic.go:334] "Generic (PLEG): container finished" podID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerID="b4a2b0d2b334eaa88bb6c32b357510979e7382c6582dd687f2155b5f20b07067" exitCode=0 Mar 12 18:10:35 crc kubenswrapper[4926]: I0312 18:10:35.311181 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerDied","Data":"b4a2b0d2b334eaa88bb6c32b357510979e7382c6582dd687f2155b5f20b07067"} Mar 12 18:10:35 crc kubenswrapper[4926]: I0312 18:10:35.311457 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerStarted","Data":"75b5fa3c244e1a8120e460f27209d8d4bc13d27e6ea5e2efce3f0acb11f95356"} Mar 12 18:10:36 crc kubenswrapper[4926]: I0312 18:10:36.318542 4926 generic.go:334] "Generic (PLEG): container finished" podID="0ef972e3-81c7-4b62-aa07-4939aef86a2d" containerID="9986cda75fb5c1cbdafdcc2f593e2eca5328cbc3ead61fc97e07ee3b4410da7d" exitCode=0 Mar 12 18:10:36 crc kubenswrapper[4926]: I0312 18:10:36.318660 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8lwn" event={"ID":"0ef972e3-81c7-4b62-aa07-4939aef86a2d","Type":"ContainerDied","Data":"9986cda75fb5c1cbdafdcc2f593e2eca5328cbc3ead61fc97e07ee3b4410da7d"} Mar 12 18:10:36 crc kubenswrapper[4926]: I0312 18:10:36.323207 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-672mz" event={"ID":"c9f79c44-e93f-48ba-9f2d-8a5b61a86089","Type":"ContainerStarted","Data":"e5916d747c43eec52f6e9fdc5fce1e305ff6c9af8aaa450a2ea24c339f9fdd70"} Mar 12 18:10:36 crc kubenswrapper[4926]: I0312 18:10:36.356294 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-672mz" podStartSLOduration=2.785500238 podStartE2EDuration="5.356278331s" podCreationTimestamp="2026-03-12 18:10:31 +0000 UTC" firstStartedPulling="2026-03-12 18:10:33.259842591 +0000 UTC m=+473.628468924" lastFinishedPulling="2026-03-12 18:10:35.830620643 +0000 UTC m=+476.199247017" observedRunningTime="2026-03-12 18:10:36.355462695 +0000 UTC m=+476.724089038" watchObservedRunningTime="2026-03-12 18:10:36.356278331 +0000 UTC m=+476.724904664" Mar 12 18:10:37 crc kubenswrapper[4926]: I0312 18:10:37.330580 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8lwn" event={"ID":"0ef972e3-81c7-4b62-aa07-4939aef86a2d","Type":"ContainerStarted","Data":"0b0384766172a95d6a3558b13f8ff146e41e28421d2d906ff78bcb5fc47029c5"} Mar 12 18:10:37 crc kubenswrapper[4926]: I0312 18:10:37.332949 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerStarted","Data":"2efb7a52879f66525ffb82d5cb820bfe29543aedd6332a889b48b179b150edf8"} Mar 12 18:10:37 crc kubenswrapper[4926]: I0312 18:10:37.362865 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8lwn" podStartSLOduration=2.519301066 podStartE2EDuration="5.362849807s" podCreationTimestamp="2026-03-12 18:10:32 +0000 UTC" 
firstStartedPulling="2026-03-12 18:10:34.274677983 +0000 UTC m=+474.643304316" lastFinishedPulling="2026-03-12 18:10:37.118226684 +0000 UTC m=+477.486853057" observedRunningTime="2026-03-12 18:10:37.362595789 +0000 UTC m=+477.731222132" watchObservedRunningTime="2026-03-12 18:10:37.362849807 +0000 UTC m=+477.731476140"
Mar 12 18:10:38 crc kubenswrapper[4926]: I0312 18:10:38.341227 4926 generic.go:334] "Generic (PLEG): container finished" podID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerID="2efb7a52879f66525ffb82d5cb820bfe29543aedd6332a889b48b179b150edf8" exitCode=0
Mar 12 18:10:38 crc kubenswrapper[4926]: I0312 18:10:38.342830 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerDied","Data":"2efb7a52879f66525ffb82d5cb820bfe29543aedd6332a889b48b179b150edf8"}
Mar 12 18:10:39 crc kubenswrapper[4926]: I0312 18:10:39.349365 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerStarted","Data":"b6bb1201cf8704c1c5218ae5ad6eb53e93f35257e13f18d1b638bfe3b2da6322"}
Mar 12 18:10:39 crc kubenswrapper[4926]: I0312 18:10:39.376197 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89rth" podStartSLOduration=2.839829219 podStartE2EDuration="6.376175715s" podCreationTimestamp="2026-03-12 18:10:33 +0000 UTC" firstStartedPulling="2026-03-12 18:10:35.311691414 +0000 UTC m=+475.680317757" lastFinishedPulling="2026-03-12 18:10:38.84803791 +0000 UTC m=+479.216664253" observedRunningTime="2026-03-12 18:10:39.370575231 +0000 UTC m=+479.739201614" watchObservedRunningTime="2026-03-12 18:10:39.376175715 +0000 UTC m=+479.744802058"
Mar 12 18:10:40 crc kubenswrapper[4926]: I0312 18:10:40.818193 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-265z5"
Mar 12 18:10:40 crc kubenswrapper[4926]: I0312 18:10:40.818581 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-265z5"
Mar 12 18:10:40 crc kubenswrapper[4926]: I0312 18:10:40.871808 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-265z5"
Mar 12 18:10:41 crc kubenswrapper[4926]: I0312 18:10:41.405793 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-265z5"
Mar 12 18:10:41 crc kubenswrapper[4926]: I0312 18:10:41.853040 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-672mz"
Mar 12 18:10:41 crc kubenswrapper[4926]: I0312 18:10:41.853388 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-672mz"
Mar 12 18:10:42 crc kubenswrapper[4926]: I0312 18:10:42.917878 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-672mz" podUID="c9f79c44-e93f-48ba-9f2d-8a5b61a86089" containerName="registry-server" probeResult="failure" output=<
Mar 12 18:10:42 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s
Mar 12 18:10:42 crc kubenswrapper[4926]: >
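The probe failure above is the registry-server startup probe trying to reach the catalog gRPC port with a one-second budget; while the catalog is still being extracted, nothing listens on :50051 and the pod stays "unhealthy". A rough stand-in using a plain TCP dial (the real probe is a gRPC health check, so this only shows the shape of it):

package main

import (
	"fmt"
	"net"
	"time"
)

// probe models a connect-style startup probe with a fixed time budget.
func probe(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within %s", addr, timeout)
	}
	conn.Close()
	return nil
}

func main() {
	// While registry-server is still loading its catalog, this fails and the
	// kubelet keeps reporting the startup probe as "unhealthy", as logged above.
	if err := probe(":50051", 1*time.Second); err != nil {
		fmt.Println(err)
	}
}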
pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:43 crc kubenswrapper[4926]: I0312 18:10:43.227119 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:43 crc kubenswrapper[4926]: I0312 18:10:43.281089 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:43 crc kubenswrapper[4926]: I0312 18:10:43.433259 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8lwn" Mar 12 18:10:44 crc kubenswrapper[4926]: I0312 18:10:44.228253 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:44 crc kubenswrapper[4926]: I0312 18:10:44.229019 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:44 crc kubenswrapper[4926]: I0312 18:10:44.266862 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:44 crc kubenswrapper[4926]: I0312 18:10:44.417615 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.250477 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" podUID="97b0faa2-bcb2-417e-9065-3156860a8644" containerName="registry" containerID="cri-o://f7c989898f3b81f561b1ebddb62bcb7b02ee6fac8bd401ab3a5ff75b154fda11" gracePeriod=30 Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.387473 4926 generic.go:334] "Generic (PLEG): container finished" podID="97b0faa2-bcb2-417e-9065-3156860a8644" containerID="f7c989898f3b81f561b1ebddb62bcb7b02ee6fac8bd401ab3a5ff75b154fda11" exitCode=0 Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.388128 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" event={"ID":"97b0faa2-bcb2-417e-9065-3156860a8644","Type":"ContainerDied","Data":"f7c989898f3b81f561b1ebddb62bcb7b02ee6fac8bd401ab3a5ff75b154fda11"} Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.850765 4926 util.go:48] "No ready sandbox for pod can be found. 
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.850765 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.942746 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.943075 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltlt7\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944191 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944363 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944502 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944749 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944869 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.944963 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted\") pod \"97b0faa2-bcb2-417e-9065-3156860a8644\" (UID: \"97b0faa2-bcb2-417e-9065-3156860a8644\") "
Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.943634 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "registry-certificates".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.947839 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.956292 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.956399 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.956564 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.960401 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.961548 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7" (OuterVolumeSpecName: "kube-api-access-ltlt7") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "kube-api-access-ltlt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:10:45 crc kubenswrapper[4926]: I0312 18:10:45.963800 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "97b0faa2-bcb2-417e-9065-3156860a8644" (UID: "97b0faa2-bcb2-417e-9065-3156860a8644"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046486 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046519 4926 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97b0faa2-bcb2-417e-9065-3156860a8644-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046532 4926 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97b0faa2-bcb2-417e-9065-3156860a8644-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046541 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltlt7\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-kube-api-access-ltlt7\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046549 4926 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046558 4926 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97b0faa2-bcb2-417e-9065-3156860a8644-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.046567 4926 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97b0faa2-bcb2-417e-9065-3156860a8644-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.394674 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt"
Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.398542 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g6fzt" event={"ID":"97b0faa2-bcb2-417e-9065-3156860a8644","Type":"ContainerDied","Data":"5fd53c65057b16b866539f523a9ef035c2f73d7900692a8369fc22f3d2f8f197"}
Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.398637 4926 scope.go:117] "RemoveContainer" containerID="f7c989898f3b81f561b1ebddb62bcb7b02ee6fac8bd401ab3a5ff75b154fda11"
Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.432996 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"]
Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.438132 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g6fzt"]
Mar 12 18:10:46 crc kubenswrapper[4926]: I0312 18:10:46.497629 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b0faa2-bcb2-417e-9065-3156860a8644" path="/var/lib/kubelet/pods/97b0faa2-bcb2-417e-9065-3156860a8644/volumes"
Mar 12 18:10:51 crc kubenswrapper[4926]: I0312 18:10:51.921285 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-672mz"
Mar 12 18:10:51 crc kubenswrapper[4926]: I0312 18:10:51.988710 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-672mz"
Mar 12 18:10:56 crc kubenswrapper[4926]: I0312 18:10:56.817643 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:10:56 crc kubenswrapper[4926]: I0312 18:10:56.818207 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:10:56 crc kubenswrapper[4926]: I0312 18:10:56.818257 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:10:57 crc kubenswrapper[4926]: I0312 18:10:57.459039 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 18:10:57 crc kubenswrapper[4926]: I0312 18:10:57.459110 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2" gracePeriod=600
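The machine-config-daemon entries above show an HTTP liveness probe failing with "connection refused" against 127.0.0.1:8798/health; once the failure threshold is reached the kubelet logs "failed liveness probe, will be restarted" and kills the container with its 600s grace period. A minimal sketch of the probe's pass/fail rule (the kubelet counts any transport error or a status outside 2xx/3xx as a failure; the URL is taken from the log):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// healthy returns false on any transport error (connection refused, timeout)
// or a status outside the 200-399 range.
func healthy(url string) bool {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	// Prints false while the daemon's health endpoint is down, as logged above.
	fmt.Println(healthy("http://127.0.0.1:8798/health"))
}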
containerID="b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2" exitCode=0 Mar 12 18:10:58 crc kubenswrapper[4926]: I0312 18:10:58.467536 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2"} Mar 12 18:10:58 crc kubenswrapper[4926]: I0312 18:10:58.467900 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b"} Mar 12 18:10:58 crc kubenswrapper[4926]: I0312 18:10:58.467919 4926 scope.go:117] "RemoveContainer" containerID="7d3bab13cabe4b82f90297599f822115d3fbb4c22873ec3b05761aca32e2caff" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.147734 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555652-mrcx4"] Mar 12 18:12:00 crc kubenswrapper[4926]: E0312 18:12:00.148518 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b0faa2-bcb2-417e-9065-3156860a8644" containerName="registry" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.148533 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b0faa2-bcb2-417e-9065-3156860a8644" containerName="registry" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.148645 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b0faa2-bcb2-417e-9065-3156860a8644" containerName="registry" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.149032 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.152272 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.152508 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.156793 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.160965 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555652-mrcx4"] Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.164691 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnjq\" (UniqueName: \"kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq\") pod \"auto-csr-approver-29555652-mrcx4\" (UID: \"a135018f-2c21-4678-a0fc-9d6b62dda2d6\") " pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.266384 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnjq\" (UniqueName: \"kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq\") pod \"auto-csr-approver-29555652-mrcx4\" (UID: \"a135018f-2c21-4678-a0fc-9d6b62dda2d6\") " pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.292245 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnjq\" (UniqueName: \"kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq\") pod \"auto-csr-approver-29555652-mrcx4\" (UID: \"a135018f-2c21-4678-a0fc-9d6b62dda2d6\") " pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.476955 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.731016 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555652-mrcx4"] Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.743641 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:12:00 crc kubenswrapper[4926]: I0312 18:12:00.892897 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" event={"ID":"a135018f-2c21-4678-a0fc-9d6b62dda2d6","Type":"ContainerStarted","Data":"cd090b4c4adefae3d508f3c966780859889876275ce95ed1f8a017bfc8d2b803"} Mar 12 18:12:02 crc kubenswrapper[4926]: I0312 18:12:02.910809 4926 generic.go:334] "Generic (PLEG): container finished" podID="a135018f-2c21-4678-a0fc-9d6b62dda2d6" containerID="80892424cda12377cff2432a54ca031087805d38670f04b3a339fc6f8678512c" exitCode=0 Mar 12 18:12:02 crc kubenswrapper[4926]: I0312 18:12:02.910878 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" event={"ID":"a135018f-2c21-4678-a0fc-9d6b62dda2d6","Type":"ContainerDied","Data":"80892424cda12377cff2432a54ca031087805d38670f04b3a339fc6f8678512c"} Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.284496 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.431285 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsnjq\" (UniqueName: \"kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq\") pod \"a135018f-2c21-4678-a0fc-9d6b62dda2d6\" (UID: \"a135018f-2c21-4678-a0fc-9d6b62dda2d6\") " Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.439742 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq" (OuterVolumeSpecName: "kube-api-access-dsnjq") pod "a135018f-2c21-4678-a0fc-9d6b62dda2d6" (UID: "a135018f-2c21-4678-a0fc-9d6b62dda2d6"). InnerVolumeSpecName "kube-api-access-dsnjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.533428 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsnjq\" (UniqueName: \"kubernetes.io/projected/a135018f-2c21-4678-a0fc-9d6b62dda2d6-kube-api-access-dsnjq\") on node \"crc\" DevicePath \"\"" Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.927215 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555652-mrcx4" event={"ID":"a135018f-2c21-4678-a0fc-9d6b62dda2d6","Type":"ContainerDied","Data":"cd090b4c4adefae3d508f3c966780859889876275ce95ed1f8a017bfc8d2b803"} Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.927272 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd090b4c4adefae3d508f3c966780859889876275ce95ed1f8a017bfc8d2b803" Mar 12 18:12:04 crc kubenswrapper[4926]: I0312 18:12:04.927296 4926 util.go:48] "No ready sandbox for pod can be found. 
Mar 12 18:12:05 crc kubenswrapper[4926]: I0312 18:12:05.342283 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555646-wqpkb"]
Mar 12 18:12:05 crc kubenswrapper[4926]: I0312 18:12:05.346491 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555646-wqpkb"]
Mar 12 18:12:06 crc kubenswrapper[4926]: I0312 18:12:06.499644 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68160cf-4e6c-4294-bfdc-4acb74637ecb" path="/var/lib/kubelet/pods/d68160cf-4e6c-4294-bfdc-4acb74637ecb/volumes"
Mar 12 18:13:26 crc kubenswrapper[4926]: I0312 18:13:26.818121 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:13:26 crc kubenswrapper[4926]: I0312 18:13:26.819008 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:13:56 crc kubenswrapper[4926]: I0312 18:13:56.817939 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:13:56 crc kubenswrapper[4926]: I0312 18:13:56.818787 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.142260 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555654-jjvcf"]
Mar 12 18:14:00 crc kubenswrapper[4926]: E0312 18:14:00.142838 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a135018f-2c21-4678-a0fc-9d6b62dda2d6" containerName="oc"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.142856 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a135018f-2c21-4678-a0fc-9d6b62dda2d6" containerName="oc"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.142970 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a135018f-2c21-4678-a0fc-9d6b62dda2d6" containerName="oc"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.143411 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.146564 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.147222 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.147345 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.151297 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555654-jjvcf"]
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.316121 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrbz\" (UniqueName: \"kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz\") pod \"auto-csr-approver-29555654-jjvcf\" (UID: \"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e\") " pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.417247 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrbz\" (UniqueName: \"kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz\") pod \"auto-csr-approver-29555654-jjvcf\" (UID: \"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e\") " pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.450837 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrbz\" (UniqueName: \"kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz\") pod \"auto-csr-approver-29555654-jjvcf\" (UID: \"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e\") " pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.466323 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.754162 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555654-jjvcf"]
Mar 12 18:14:00 crc kubenswrapper[4926]: I0312 18:14:00.956549 4926 scope.go:117] "RemoveContainer" containerID="52e8bc12e7903e1d736ebf550d285fa84f2e22f28d746c1ea7750000e16f3903"
Mar 12 18:14:01 crc kubenswrapper[4926]: I0312 18:14:01.715539 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555654-jjvcf" event={"ID":"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e","Type":"ContainerStarted","Data":"89f51dfda28111729ff6b326d4927dd9f5c3592cb11d933b305b0ad0af84d6b4"}
Mar 12 18:14:02 crc kubenswrapper[4926]: I0312 18:14:02.733267 4926 generic.go:334] "Generic (PLEG): container finished" podID="b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" containerID="5d29d1a82fd23f75a9296b7891f008517b3e7945af33e9449e2dfcb52711dd8e" exitCode=0
Mar 12 18:14:02 crc kubenswrapper[4926]: I0312 18:14:02.733347 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555654-jjvcf" event={"ID":"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e","Type":"ContainerDied","Data":"5d29d1a82fd23f75a9296b7891f008517b3e7945af33e9449e2dfcb52711dd8e"}
Mar 12 18:14:03 crc kubenswrapper[4926]: I0312 18:14:03.998196 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.172818 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsrbz\" (UniqueName: \"kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz\") pod \"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e\" (UID: \"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e\") "
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.182957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz" (OuterVolumeSpecName: "kube-api-access-bsrbz") pod "b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" (UID: "b3e47fc3-fa77-4992-b1f8-dfff8b4d924e"). InnerVolumeSpecName "kube-api-access-bsrbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.275137 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsrbz\" (UniqueName: \"kubernetes.io/projected/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e-kube-api-access-bsrbz\") on node \"crc\" DevicePath \"\""
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.751027 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555654-jjvcf" event={"ID":"b3e47fc3-fa77-4992-b1f8-dfff8b4d924e","Type":"ContainerDied","Data":"89f51dfda28111729ff6b326d4927dd9f5c3592cb11d933b305b0ad0af84d6b4"}
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.751128 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89f51dfda28111729ff6b326d4927dd9f5c3592cb11d933b305b0ad0af84d6b4"
Mar 12 18:14:04 crc kubenswrapper[4926]: I0312 18:14:04.751137 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555654-jjvcf"
Mar 12 18:14:05 crc kubenswrapper[4926]: I0312 18:14:05.075753 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555648-j9n25"]
Mar 12 18:14:05 crc kubenswrapper[4926]: I0312 18:14:05.082053 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555648-j9n25"]
Mar 12 18:14:06 crc kubenswrapper[4926]: I0312 18:14:06.505695 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbfae2a-83a6-4586-a33c-b9bcfd4df092" path="/var/lib/kubelet/pods/8dbfae2a-83a6-4586-a33c-b9bcfd4df092/volumes"
Mar 12 18:14:26 crc kubenswrapper[4926]: I0312 18:14:26.817274 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:14:26 crc kubenswrapper[4926]: I0312 18:14:26.817890 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:14:26 crc kubenswrapper[4926]: I0312 18:14:26.817965 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:14:26 crc kubenswrapper[4926]: I0312 18:14:26.818836 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 18:14:26 crc kubenswrapper[4926]: I0312 18:14:26.818948 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b" gracePeriod=600
Mar 12 18:14:27 crc kubenswrapper[4926]: I0312 18:14:27.924349 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b" exitCode=0
Mar 12 18:14:27 crc kubenswrapper[4926]: I0312 18:14:27.924508 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b"}
Mar 12 18:14:27 crc kubenswrapper[4926]: I0312 18:14:27.925140 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed"}
Mar 12 18:14:27 crc kubenswrapper[4926]: I0312 18:14:27.925215 4926 scope.go:117] "RemoveContainer" containerID="b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2"
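The probe records above show the full liveness-restart arc for machine-config-daemon-hmdg8: "Probe failed" at 18:13:26, 18:13:56 and 18:14:26, then "SyncLoop (probe)" flips to unhealthy, the container is killed with gracePeriod=600, and PLEG reports the replacement started a second later. A small sketch for extracting that cadence from a captured journal; the regex and helper are illustrative, and the 30-second period is inferred from these timestamps rather than read from the pod spec:

    import re
    from datetime import datetime

    # Syslog-style prefix on every record in this journal,
    # e.g. "Mar 12 18:13:26 crc kubenswrapper[4926]: ..."
    TS_RE = re.compile(r"^(\w{3} {1,2}\d+ \d{2}:\d{2}:\d{2}) ")

    def probe_failures(lines, pod):
        """Return timestamps of "Probe failed" records for the given pod."""
        out = []
        for line in lines:
            if '"Probe failed"' in line and f'pod="{pod}"' in line:
                m = TS_RE.match(line)
                if m:
                    # The prefix carries no year; strptime defaults it to 1900.
                    out.append(datetime.strptime(m.group(1), "%b %d %H:%M:%S"))
        return out

For the failures at 18:13:26, 18:13:56 and 18:14:26 the successive gaps are 30 s each, i.e. the observed probe period leading up to the restart.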
containerID="b3aa39c92eb410de660dbb6e5cf8c0dd506addf3c227d2622f913bd6b55014e2" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.159061 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6"] Mar 12 18:15:00 crc kubenswrapper[4926]: E0312 18:15:00.160575 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" containerName="oc" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.160608 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" containerName="oc" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.160881 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" containerName="oc" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.161868 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.168948 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.174100 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.176080 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6"] Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.257955 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.258038 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.258253 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krf5x\" (UniqueName: \"kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.359837 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.359984 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-krf5x\" (UniqueName: \"kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.360050 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.361499 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.368778 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.390029 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krf5x\" (UniqueName: \"kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x\") pod \"collect-profiles-29555655-pzwh6\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.495581 4926 util.go:30] "No sandbox for pod can be found. 
Mar 12 18:15:00 crc kubenswrapper[4926]: I0312 18:15:00.724829 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6"]
Mar 12 18:15:01 crc kubenswrapper[4926]: I0312 18:15:01.023968 4926 scope.go:117] "RemoveContainer" containerID="57bd95669fead78f455d353e126bbd0157addbc0b0171c28ad86c1bdee789263"
Mar 12 18:15:01 crc kubenswrapper[4926]: I0312 18:15:01.167520 4926 generic.go:334] "Generic (PLEG): container finished" podID="69533265-f2d7-4566-a876-732db938b254" containerID="faccc149e57f019c53b37ac265b109708c4a9cb3385ebb8ecda1f786761d99fb" exitCode=0
Mar 12 18:15:01 crc kubenswrapper[4926]: I0312 18:15:01.167566 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" event={"ID":"69533265-f2d7-4566-a876-732db938b254","Type":"ContainerDied","Data":"faccc149e57f019c53b37ac265b109708c4a9cb3385ebb8ecda1f786761d99fb"}
Mar 12 18:15:01 crc kubenswrapper[4926]: I0312 18:15:01.167600 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" event={"ID":"69533265-f2d7-4566-a876-732db938b254","Type":"ContainerStarted","Data":"c122a8e907362d7f0e0eb0123d6829139638732723b911d7f545569fdcbc25b6"}
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.467353 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6"
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.490373 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krf5x\" (UniqueName: \"kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x\") pod \"69533265-f2d7-4566-a876-732db938b254\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") "
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.490427 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume\") pod \"69533265-f2d7-4566-a876-732db938b254\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") "
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.490526 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume\") pod \"69533265-f2d7-4566-a876-732db938b254\" (UID: \"69533265-f2d7-4566-a876-732db938b254\") "
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.491288 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume" (OuterVolumeSpecName: "config-volume") pod "69533265-f2d7-4566-a876-732db938b254" (UID: "69533265-f2d7-4566-a876-732db938b254"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.497645 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69533265-f2d7-4566-a876-732db938b254" (UID: "69533265-f2d7-4566-a876-732db938b254"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.499112 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x" (OuterVolumeSpecName: "kube-api-access-krf5x") pod "69533265-f2d7-4566-a876-732db938b254" (UID: "69533265-f2d7-4566-a876-732db938b254"). InnerVolumeSpecName "kube-api-access-krf5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.592223 4926 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69533265-f2d7-4566-a876-732db938b254-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.592267 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krf5x\" (UniqueName: \"kubernetes.io/projected/69533265-f2d7-4566-a876-732db938b254-kube-api-access-krf5x\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:02 crc kubenswrapper[4926]: I0312 18:15:02.592280 4926 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69533265-f2d7-4566-a876-732db938b254-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:03 crc kubenswrapper[4926]: I0312 18:15:03.198801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" event={"ID":"69533265-f2d7-4566-a876-732db938b254","Type":"ContainerDied","Data":"c122a8e907362d7f0e0eb0123d6829139638732723b911d7f545569fdcbc25b6"} Mar 12 18:15:03 crc kubenswrapper[4926]: I0312 18:15:03.199298 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c122a8e907362d7f0e0eb0123d6829139638732723b911d7f545569fdcbc25b6" Mar 12 18:15:03 crc kubenswrapper[4926]: I0312 18:15:03.198884 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555655-pzwh6" Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.282497 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"] Mar 12 18:15:43 crc kubenswrapper[4926]: E0312 18:15:43.283197 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69533265-f2d7-4566-a876-732db938b254" containerName="collect-profiles" Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.283211 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="69533265-f2d7-4566-a876-732db938b254" containerName="collect-profiles" Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.283325 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="69533265-f2d7-4566-a876-732db938b254" containerName="collect-profiles" Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.285988 4926 util.go:30] "No sandbox for pod can be found. 
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.292506 4926 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-spdvk"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.292643 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.292758 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.295084 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.303746 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7pf7h"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.304531 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7pf7h"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.321004 4926 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dqkb8"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.352792 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kmk\" (UniqueName: \"kubernetes.io/projected/04933adf-efe6-4d54-8575-cc5c4069ea9a-kube-api-access-n5kmk\") pod \"cert-manager-858654f9db-7pf7h\" (UID: \"04933adf-efe6-4d54-8575-cc5c4069ea9a\") " pod="cert-manager/cert-manager-858654f9db-7pf7h"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.353129 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzx22\" (UniqueName: \"kubernetes.io/projected/eaa86db1-fe85-4b00-b8e0-c61cb013f52d-kube-api-access-pzx22\") pod \"cert-manager-cainjector-cf98fcc89-mbhfc\" (UID: \"eaa86db1-fe85-4b00-b8e0-c61cb013f52d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.355412 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p8p7c"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.358131 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.359966 4926 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tdvf8"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.365016 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7pf7h"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.368928 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p8p7c"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.453855 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzx22\" (UniqueName: \"kubernetes.io/projected/eaa86db1-fe85-4b00-b8e0-c61cb013f52d-kube-api-access-pzx22\") pod \"cert-manager-cainjector-cf98fcc89-mbhfc\" (UID: \"eaa86db1-fe85-4b00-b8e0-c61cb013f52d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.453944 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kmk\" (UniqueName: \"kubernetes.io/projected/04933adf-efe6-4d54-8575-cc5c4069ea9a-kube-api-access-n5kmk\") pod \"cert-manager-858654f9db-7pf7h\" (UID: \"04933adf-efe6-4d54-8575-cc5c4069ea9a\") " pod="cert-manager/cert-manager-858654f9db-7pf7h"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.477447 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kmk\" (UniqueName: \"kubernetes.io/projected/04933adf-efe6-4d54-8575-cc5c4069ea9a-kube-api-access-n5kmk\") pod \"cert-manager-858654f9db-7pf7h\" (UID: \"04933adf-efe6-4d54-8575-cc5c4069ea9a\") " pod="cert-manager/cert-manager-858654f9db-7pf7h"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.481624 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzx22\" (UniqueName: \"kubernetes.io/projected/eaa86db1-fe85-4b00-b8e0-c61cb013f52d-kube-api-access-pzx22\") pod \"cert-manager-cainjector-cf98fcc89-mbhfc\" (UID: \"eaa86db1-fe85-4b00-b8e0-c61cb013f52d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.555138 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86966\" (UniqueName: \"kubernetes.io/projected/ef14eb59-d30a-437c-80d1-70513a544b2d-kube-api-access-86966\") pod \"cert-manager-webhook-687f57d79b-p8p7c\" (UID: \"ef14eb59-d30a-437c-80d1-70513a544b2d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.605685 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.647267 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7pf7h"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.656726 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86966\" (UniqueName: \"kubernetes.io/projected/ef14eb59-d30a-437c-80d1-70513a544b2d-kube-api-access-86966\") pod \"cert-manager-webhook-687f57d79b-p8p7c\" (UID: \"ef14eb59-d30a-437c-80d1-70513a544b2d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.680154 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86966\" (UniqueName: \"kubernetes.io/projected/ef14eb59-d30a-437c-80d1-70513a544b2d-kube-api-access-86966\") pod \"cert-manager-webhook-687f57d79b-p8p7c\" (UID: \"ef14eb59-d30a-437c-80d1-70513a544b2d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.841394 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc"]
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.889655 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7pf7h"]
Mar 12 18:15:43 crc kubenswrapper[4926]: W0312 18:15:43.895524 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04933adf_efe6_4d54_8575_cc5c4069ea9a.slice/crio-7d3bd66ee4d91c13c77673eccd2270176a8715c827c63ca1ddcfab990f1f37f1 WatchSource:0}: Error finding container 7d3bd66ee4d91c13c77673eccd2270176a8715c827c63ca1ddcfab990f1f37f1: Status 404 returned error can't find the container with id 7d3bd66ee4d91c13c77673eccd2270176a8715c827c63ca1ddcfab990f1f37f1
Mar 12 18:15:43 crc kubenswrapper[4926]: I0312 18:15:43.974149 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:44 crc kubenswrapper[4926]: I0312 18:15:44.173904 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p8p7c"]
Mar 12 18:15:44 crc kubenswrapper[4926]: W0312 18:15:44.180632 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef14eb59_d30a_437c_80d1_70513a544b2d.slice/crio-820bc38aea3d9c60d3d68cfc4b4fb00ab5ec20a971c76a654f49c505bd143284 WatchSource:0}: Error finding container 820bc38aea3d9c60d3d68cfc4b4fb00ab5ec20a971c76a654f49c505bd143284: Status 404 returned error can't find the container with id 820bc38aea3d9c60d3d68cfc4b4fb00ab5ec20a971c76a654f49c505bd143284
Mar 12 18:15:44 crc kubenswrapper[4926]: I0312 18:15:44.506799 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c" event={"ID":"ef14eb59-d30a-437c-80d1-70513a544b2d","Type":"ContainerStarted","Data":"820bc38aea3d9c60d3d68cfc4b4fb00ab5ec20a971c76a654f49c505bd143284"}
Mar 12 18:15:44 crc kubenswrapper[4926]: I0312 18:15:44.506878 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc" event={"ID":"eaa86db1-fe85-4b00-b8e0-c61cb013f52d","Type":"ContainerStarted","Data":"2497c20f07fed49993dc459fd6d32fb2a0a69d28ec2ce1932b80d92a25e239f2"}
Mar 12 18:15:44 crc kubenswrapper[4926]: I0312 18:15:44.507106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7pf7h" event={"ID":"04933adf-efe6-4d54-8575-cc5c4069ea9a","Type":"ContainerStarted","Data":"7d3bd66ee4d91c13c77673eccd2270176a8715c827c63ca1ddcfab990f1f37f1"}
Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.533686 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7pf7h" event={"ID":"04933adf-efe6-4d54-8575-cc5c4069ea9a","Type":"ContainerStarted","Data":"75169479de569e14a862d459dfadcad2bbda4900c5eadc77a05da06117ce0055"}
Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.536339 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c" event={"ID":"ef14eb59-d30a-437c-80d1-70513a544b2d","Type":"ContainerStarted","Data":"e2e67a1a1d0b57520b59b6c5c98db19e3f96b7bcd647c2b70927945105dc4008"}
Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.536554 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c"
Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.538965 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc" event={"ID":"eaa86db1-fe85-4b00-b8e0-c61cb013f52d","Type":"ContainerStarted","Data":"70cd182f90e9394d2f17d256e0e0e5c9b125bc76c12c13592271a70812214135"}
Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.564718 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7pf7h" podStartSLOduration=1.7459865749999999 podStartE2EDuration="5.564690053s" podCreationTimestamp="2026-03-12 18:15:43 +0000 UTC" firstStartedPulling="2026-03-12 18:15:43.897720114 +0000 UTC m=+784.266346447" lastFinishedPulling="2026-03-12 18:15:47.716423592 +0000 UTC m=+788.085049925" observedRunningTime="2026-03-12 18:15:48.557596392 +0000 UTC m=+788.926222785" watchObservedRunningTime="2026-03-12 18:15:48.564690053 +0000 UTC m=+788.933316426"
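The startup-latency record just above reports two figures for cert-manager-858654f9db-7pf7h. The SLO duration appears to be the end-to-end duration minus the time spent pulling images, which the record brackets with firstStartedPulling/lastFinishedPulling; this can be checked with the monotonic (m=+...) offsets the record carries:

    # Monotonic offsets copied from the record above (cert-manager-858654f9db-7pf7h):
    first_started_pulling = 784.266346447  # m=+ offset of firstStartedPulling
    last_finished_pulling = 788.085049925  # m=+ offset of lastFinishedPulling
    e2e = 5.564690053                      # podStartE2EDuration, in seconds

    pull = last_finished_pulling - first_started_pulling  # 3.818703478 s pulling images
    print(e2e - pull)  # ~1.745986575, matching podStartSLOduration up to float rounding

The same subtraction reproduces the SLO figures in the two records that follow.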
m=+788.933316426" Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.577923 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c" podStartSLOduration=2.104374805 podStartE2EDuration="5.577898694s" podCreationTimestamp="2026-03-12 18:15:43 +0000 UTC" firstStartedPulling="2026-03-12 18:15:44.184234647 +0000 UTC m=+784.552860980" lastFinishedPulling="2026-03-12 18:15:47.657758536 +0000 UTC m=+788.026384869" observedRunningTime="2026-03-12 18:15:48.574238589 +0000 UTC m=+788.942864992" watchObservedRunningTime="2026-03-12 18:15:48.577898694 +0000 UTC m=+788.946525057" Mar 12 18:15:48 crc kubenswrapper[4926]: I0312 18:15:48.590299 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mbhfc" podStartSLOduration=1.78312085 podStartE2EDuration="5.590275669s" podCreationTimestamp="2026-03-12 18:15:43 +0000 UTC" firstStartedPulling="2026-03-12 18:15:43.851144574 +0000 UTC m=+784.219770907" lastFinishedPulling="2026-03-12 18:15:47.658299393 +0000 UTC m=+788.026925726" observedRunningTime="2026-03-12 18:15:48.587865264 +0000 UTC m=+788.956491627" watchObservedRunningTime="2026-03-12 18:15:48.590275669 +0000 UTC m=+788.958902042" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.468586 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlfmg"] Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.469977 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-controller" containerID="cri-o://c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470063 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="nbdb" containerID="cri-o://c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470128 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470196 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-node" containerID="cri-o://d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470178 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="sbdb" containerID="cri-o://0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470294 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="northd" 
containerID="cri-o://88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.470304 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-acl-logging" containerID="cri-o://1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.534011 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" containerID="cri-o://a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" gracePeriod=30 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.620342 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.630614 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-acl-logging/0.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.631080 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-controller/0.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.631419 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" exitCode=0 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.631476 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" exitCode=143 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.631524 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.631554 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.634031 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/2.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.635059 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/1.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.635104 4926 generic.go:334] "Generic (PLEG): container finished" podID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" containerID="2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481" exitCode=2 Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.635141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" 
event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerDied","Data":"2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481"} Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.635178 4926 scope.go:117] "RemoveContainer" containerID="bb6d6fe627e09e41640be0175c96b2d983a5b9f7b7e50c1792cfda71adaf2cf4" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.635697 4926 scope.go:117] "RemoveContainer" containerID="2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.635967 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xwqvl_openshift-multus(d5a53ef4-c701-457f-9cf2-85819bf04d1a)\"" pod="openshift-multus/multus-xwqvl" podUID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.761327 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.764214 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-acl-logging/0.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.764787 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-controller/0.log" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.765264 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832426 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94d7p"] Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832775 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832804 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832823 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832836 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832879 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-node" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832894 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-node" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832911 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="nbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832922 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" 
containerName="nbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832944 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="northd" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832956 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="northd" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.832974 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-acl-logging" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.832987 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-acl-logging" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833006 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833020 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833035 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kubecfg-setup" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833047 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kubecfg-setup" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833065 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="sbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833076 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="sbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833097 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833109 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833126 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833138 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833304 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="northd" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833319 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833334 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-node" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833352 4926 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="sbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833365 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-acl-logging" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833379 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833396 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833415 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833429 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833493 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="nbdb" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833508 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovn-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833686 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833701 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: E0312 18:15:53.833718 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833731 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.833910 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerName="ovnkube-controller" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.836947 4926 util.go:30] "No sandbox for pod can be found. 
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916509 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916586 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916619 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916646 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916667 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916694 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5dd\" (UniqueName: \"kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916718 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916736 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916757 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916797 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") "
\"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916816 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916834 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916856 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916885 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916904 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916939 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.916984 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917005 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917028 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917046 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns\") pod \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\" (UID: \"bc33af41-5aa0-4254-ac75-69433d5f4ce9\") " Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917283 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917322 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log" (OuterVolumeSpecName: "node-log") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917345 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917344 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917366 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash" (OuterVolumeSpecName: "host-slash") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917775 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917832 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.917856 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918080 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918115 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918133 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918150 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918149 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918167 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918180 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918236 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.918282 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket" (OuterVolumeSpecName: "log-socket") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.925104 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.925120 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd" (OuterVolumeSpecName: "kube-api-access-4t5dd") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "kube-api-access-4t5dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.932273 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bc33af41-5aa0-4254-ac75-69433d5f4ce9" (UID: "bc33af41-5aa0-4254-ac75-69433d5f4ce9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:15:53 crc kubenswrapper[4926]: I0312 18:15:53.977648 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-p8p7c" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018429 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-etc-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018524 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-systemd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018576 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-bin\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018615 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-slash\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018659 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-netd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018695 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-systemd-units\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018820 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-env-overrides\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018871 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.018994 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovn-node-metrics-cert\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019078 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019106 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-node-log\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019168 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-var-lib-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019249 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-config\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019265 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-kubelet\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019292 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019324 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llt9q\" (UniqueName: \"kubernetes.io/projected/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-kube-api-access-llt9q\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019343 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-netns\") pod \"ovnkube-node-94d7p\" (UID: 
\"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019744 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-ovn\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019815 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-log-socket\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.019871 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-script-lib\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020075 4926 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020108 4926 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020131 4926 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020154 4926 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020176 4926 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020200 4926 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020229 4926 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020255 4926 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020280 4926 reconciler_common.go:293] "Volume 
detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020304 4926 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020322 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5dd\" (UniqueName: \"kubernetes.io/projected/bc33af41-5aa0-4254-ac75-69433d5f4ce9-kube-api-access-4t5dd\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020341 4926 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020359 4926 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020380 4926 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-log-socket\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020400 4926 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020420 4926 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-node-log\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020481 4926 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020500 4926 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-host-slash\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020519 4926 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bc33af41-5aa0-4254-ac75-69433d5f4ce9-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.020538 4926 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bc33af41-5aa0-4254-ac75-69433d5f4ce9-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121500 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-bin\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc 
kubenswrapper[4926]: I0312 18:15:54.121561 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-slash\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121611 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-netd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121652 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-systemd-units\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121684 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-env-overrides\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121721 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121720 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-slash\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121752 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovn-node-metrics-cert\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121779 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-netd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121891 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121920 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121927 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-systemd-units\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121973 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121965 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-node-log\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.121939 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-cni-bin\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122018 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-node-log\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122094 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-var-lib-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122170 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-config\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122202 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-kubelet\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122226 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-var-lib-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122235 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-netns\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-netns\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122284 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122314 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-kubelet\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122318 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llt9q\" (UniqueName: \"kubernetes.io/projected/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-kube-api-access-llt9q\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122390 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-ovn\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122435 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-log-socket\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122509 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-script-lib\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122571 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-etc-openvswitch\") pod 
\"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122605 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-systemd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122742 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-systemd\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122756 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-host-run-ovn-kubernetes\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122785 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-run-ovn\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122843 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-etc-openvswitch\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.122931 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-log-socket\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.123168 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-config\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.123514 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-env-overrides\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.123990 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovnkube-script-lib\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 
18:15:54.126617 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-ovn-node-metrics-cert\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.152585 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llt9q\" (UniqueName: \"kubernetes.io/projected/2054c0e9-f3e6-4d68-ab8f-57466f8b3e47-kube-api-access-llt9q\") pod \"ovnkube-node-94d7p\" (UID: \"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47\") " pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.163409 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.644289 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/2.log" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.646759 4926 generic.go:334] "Generic (PLEG): container finished" podID="2054c0e9-f3e6-4d68-ab8f-57466f8b3e47" containerID="0017057d393b7fac29161022acddc46b04957b0c846ef899e05ba04301738f66" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.646892 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerDied","Data":"0017057d393b7fac29161022acddc46b04957b0c846ef899e05ba04301738f66"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.646996 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"33c8027e2c14712760adfead899aee80df4c4ebc8bcac548b0a119c466eef8a2"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.650805 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovnkube-controller/3.log" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.656770 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-acl-logging/0.log" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.657722 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlfmg_bc33af41-5aa0-4254-ac75-69433d5f4ce9/ovn-controller/0.log" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.658698 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.658705 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.658807 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.658555 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660111 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660138 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660160 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660180 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" exitCode=0 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660199 4926 generic.go:334] "Generic (PLEG): container finished" podID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" exitCode=143 Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660240 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660352 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660376 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660396 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660416 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" 
event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660467 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660487 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660499 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660510 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660521 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660532 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660543 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660553 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660564 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660579 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlfmg" event={"ID":"bc33af41-5aa0-4254-ac75-69433d5f4ce9","Type":"ContainerDied","Data":"30c6b754be03f1fd88325819e9237821f35a5d4ca5f0ad8545574e86e65cadf9"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660598 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660612 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660623 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660641 4926 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660652 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660663 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660674 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660684 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660695 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.660705 4926 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.685780 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.744688 4926 scope.go:117] "RemoveContainer" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.764640 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlfmg"] Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.769212 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zlfmg"] Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.778803 4926 scope.go:117] "RemoveContainer" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.796647 4926 scope.go:117] "RemoveContainer" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.813147 4926 scope.go:117] "RemoveContainer" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.840693 4926 scope.go:117] "RemoveContainer" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.870626 4926 scope.go:117] "RemoveContainer" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.887265 4926 scope.go:117] "RemoveContainer" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.920655 4926 scope.go:117] "RemoveContainer" 
containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.951323 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.951952 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.952023 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} err="failed to get container status \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.952067 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.952516 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": container with ID starting with 4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d not found: ID does not exist" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.952571 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} err="failed to get container status \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": rpc error: code = NotFound desc = could not find container \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": container with ID starting with 4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.952608 4926 scope.go:117] "RemoveContainer" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.952947 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": container with ID starting with 0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4 not found: ID does not exist" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.952984 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} err="failed to get container status \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": rpc error: code = 
NotFound desc = could not find container \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": container with ID starting with 0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.953008 4926 scope.go:117] "RemoveContainer" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.953522 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": container with ID starting with c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9 not found: ID does not exist" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.953565 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} err="failed to get container status \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": rpc error: code = NotFound desc = could not find container \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": container with ID starting with c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.953592 4926 scope.go:117] "RemoveContainer" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.953994 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": container with ID starting with 88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5 not found: ID does not exist" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954044 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} err="failed to get container status \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": rpc error: code = NotFound desc = could not find container \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": container with ID starting with 88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954078 4926 scope.go:117] "RemoveContainer" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.954410 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": container with ID starting with 6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d not found: ID does not exist" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954477 4926 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} err="failed to get container status \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": rpc error: code = NotFound desc = could not find container \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": container with ID starting with 6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954504 4926 scope.go:117] "RemoveContainer" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.954823 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": container with ID starting with d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a not found: ID does not exist" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954854 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} err="failed to get container status \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": rpc error: code = NotFound desc = could not find container \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": container with ID starting with d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.954871 4926 scope.go:117] "RemoveContainer" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.955145 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": container with ID starting with 1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171 not found: ID does not exist" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.955186 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} err="failed to get container status \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": rpc error: code = NotFound desc = could not find container \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": container with ID starting with 1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.955211 4926 scope.go:117] "RemoveContainer" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.955631 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": container with ID starting with c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2 not found: ID does not exist" 
containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.955658 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} err="failed to get container status \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": rpc error: code = NotFound desc = could not find container \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": container with ID starting with c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.955677 4926 scope.go:117] "RemoveContainer" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: E0312 18:15:54.956039 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": container with ID starting with 5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe not found: ID does not exist" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956088 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} err="failed to get container status \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": rpc error: code = NotFound desc = could not find container \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": container with ID starting with 5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956116 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956425 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} err="failed to get container status \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956506 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956959 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} err="failed to get container status \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": rpc error: code = NotFound desc = could not find container \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": container with ID starting with 4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.956997 4926 scope.go:117] "RemoveContainer" 
containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.957546 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} err="failed to get container status \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": rpc error: code = NotFound desc = could not find container \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": container with ID starting with 0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.957574 4926 scope.go:117] "RemoveContainer" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.957902 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} err="failed to get container status \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": rpc error: code = NotFound desc = could not find container \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": container with ID starting with c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.957943 4926 scope.go:117] "RemoveContainer" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.958326 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} err="failed to get container status \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": rpc error: code = NotFound desc = could not find container \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": container with ID starting with 88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.958352 4926 scope.go:117] "RemoveContainer" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.958711 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} err="failed to get container status \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": rpc error: code = NotFound desc = could not find container \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": container with ID starting with 6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.958747 4926 scope.go:117] "RemoveContainer" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959064 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} err="failed to get container status \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": rpc error: code = NotFound desc = could not find 
container \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": container with ID starting with d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959085 4926 scope.go:117] "RemoveContainer" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959380 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} err="failed to get container status \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": rpc error: code = NotFound desc = could not find container \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": container with ID starting with 1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959416 4926 scope.go:117] "RemoveContainer" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959970 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} err="failed to get container status \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": rpc error: code = NotFound desc = could not find container \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": container with ID starting with c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.959999 4926 scope.go:117] "RemoveContainer" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.960340 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} err="failed to get container status \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": rpc error: code = NotFound desc = could not find container \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": container with ID starting with 5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.960398 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961019 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} err="failed to get container status \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961060 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961386 4926 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} err="failed to get container status \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": rpc error: code = NotFound desc = could not find container \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": container with ID starting with 4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961407 4926 scope.go:117] "RemoveContainer" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961916 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} err="failed to get container status \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": rpc error: code = NotFound desc = could not find container \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": container with ID starting with 0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.961960 4926 scope.go:117] "RemoveContainer" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.962492 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} err="failed to get container status \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": rpc error: code = NotFound desc = could not find container \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": container with ID starting with c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.962518 4926 scope.go:117] "RemoveContainer" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.962953 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} err="failed to get container status \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": rpc error: code = NotFound desc = could not find container \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": container with ID starting with 88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.962995 4926 scope.go:117] "RemoveContainer" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.963328 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} err="failed to get container status \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": rpc error: code = NotFound desc = could not find container \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": container with ID starting with 
6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.963369 4926 scope.go:117] "RemoveContainer" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.963706 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} err="failed to get container status \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": rpc error: code = NotFound desc = could not find container \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": container with ID starting with d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.963732 4926 scope.go:117] "RemoveContainer" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964025 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} err="failed to get container status \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": rpc error: code = NotFound desc = could not find container \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": container with ID starting with 1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964055 4926 scope.go:117] "RemoveContainer" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964394 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} err="failed to get container status \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": rpc error: code = NotFound desc = could not find container \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": container with ID starting with c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964418 4926 scope.go:117] "RemoveContainer" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964744 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} err="failed to get container status \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": rpc error: code = NotFound desc = could not find container \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": container with ID starting with 5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.964769 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965056 4926 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} err="failed to get container status \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965082 4926 scope.go:117] "RemoveContainer" containerID="4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965385 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d"} err="failed to get container status \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": rpc error: code = NotFound desc = could not find container \"4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d\": container with ID starting with 4cefeacdf01b18198adfc4e477f52aa244393d6d3a5bdc1c6910c26c5054b51d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965410 4926 scope.go:117] "RemoveContainer" containerID="0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965831 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4"} err="failed to get container status \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": rpc error: code = NotFound desc = could not find container \"0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4\": container with ID starting with 0e7256eaa446c53af7a1b21e76d692293ff3adb9f835bd6b4d3ff47074e38ae4 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.965889 4926 scope.go:117] "RemoveContainer" containerID="c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.966361 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9"} err="failed to get container status \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": rpc error: code = NotFound desc = could not find container \"c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9\": container with ID starting with c6b99cf03ba8e56bb4a9fe6cd9590db23fcf485cc69a869191ed57d04675c8d9 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.966388 4926 scope.go:117] "RemoveContainer" containerID="88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.966687 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5"} err="failed to get container status \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": rpc error: code = NotFound desc = could not find container \"88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5\": container with ID starting with 88cb63d97c58ddb777833f105b5f3ec78639266b637e061e6da8633f850da6c5 not found: ID does not exist" Mar 
12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.966728 4926 scope.go:117] "RemoveContainer" containerID="6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967064 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d"} err="failed to get container status \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": rpc error: code = NotFound desc = could not find container \"6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d\": container with ID starting with 6265e0fe77312fcdcf7e870ecba7e55a10f1ba96ca2e0b405a377ed75013bc4d not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967089 4926 scope.go:117] "RemoveContainer" containerID="d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967503 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a"} err="failed to get container status \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": rpc error: code = NotFound desc = could not find container \"d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a\": container with ID starting with d9ffae7f86cae4692b30a6bfc1d646f6fdb53cd94499e5d9aa9abd106b13973a not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967530 4926 scope.go:117] "RemoveContainer" containerID="1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967850 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171"} err="failed to get container status \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": rpc error: code = NotFound desc = could not find container \"1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171\": container with ID starting with 1aadf309eb99a2cedba4cd2d2ceaf318e9f6c500624dd823ef281c3bbcbe0171 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.967874 4926 scope.go:117] "RemoveContainer" containerID="c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.968256 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2"} err="failed to get container status \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": rpc error: code = NotFound desc = could not find container \"c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2\": container with ID starting with c34ee146517b1408684a7c66b6ce58b98359e200020a8e36bd575e851053f8c2 not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.968283 4926 scope.go:117] "RemoveContainer" containerID="5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.968628 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe"} err="failed to get container status 
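(The burst above is the kubelet's container-removal path re-querying ContainerStatus for containers CRI-O has already pruned; the NotFound responses are logged as errors but in effect treated as "already deleted". A minimal Go sketch of that idempotent-removal pattern, under stated assumptions: the RuntimeClient interface, ErrNotFound, and removeIfPresent below are hypothetical names for illustration, not the actual kubelet or CRI source.)

    package main

    import (
    	"errors"
    	"fmt"
    )

    // RuntimeClient stands in for a CRI runtime connection (illustrative only,
    // not the real CRI API surface).
    type RuntimeClient interface {
    	// ContainerStatus returns ErrNotFound once the runtime has pruned the container.
    	ContainerStatus(id string) error
    	RemoveContainer(id string) error
    }

    // ErrNotFound models the gRPC codes.NotFound responses seen in the log above.
    var ErrNotFound = errors.New("container not found: ID does not exist")

    // removeIfPresent mirrors the pattern visible in the log: a NotFound status
    // means the container is already gone, so removal degrades to a logged no-op
    // instead of an error that would abort pod cleanup.
    func removeIfPresent(rt RuntimeClient, id string) error {
    	if err := rt.ContainerStatus(id); errors.Is(err, ErrNotFound) {
    		fmt.Printf("container %.12s already removed, skipping\n", id)
    		return nil
    	}
    	return rt.RemoveContainer(id)
    }

    // prunedRuntime simulates a runtime that has already garbage-collected everything.
    type prunedRuntime struct{}

    func (prunedRuntime) ContainerStatus(string) error { return ErrNotFound }
    func (prunedRuntime) RemoveContainer(string) error { return nil }

    func main() {
    	// Any of the container IDs in the burst above would take the skip path.
    	_ = removeIfPresent(prunedRuntime{}, "a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d")
    }
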
\"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": rpc error: code = NotFound desc = could not find container \"5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe\": container with ID starting with 5722e101b3372b44b5f172d42ac38d0ddacf3abc51e97a696929af35b2600afe not found: ID does not exist" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.968668 4926 scope.go:117] "RemoveContainer" containerID="a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d" Mar 12 18:15:54 crc kubenswrapper[4926]: I0312 18:15:54.969034 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d"} err="failed to get container status \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": rpc error: code = NotFound desc = could not find container \"a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d\": container with ID starting with a6cd6dfb15f31ba734147337f04aacdd6c5480a9c23f974d37ab3b8e0d87f14d not found: ID does not exist" Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671256 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"e38b783fd4740c7da86f3b4c83e78bb8d8e06623f5e9efef89b65497cd4c3240"} Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671299 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"1a6b357d88744580e2fbeb41ce6b54b488057a60bc9995c8e9d3fb59a5afce9c"} Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671312 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"4beb46b32dc9802cd09149dd05a9d0029b8f0a7a397ba3a3dcd17c57630bd80d"} Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671321 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"cae0d92edf6eac9bf7d1bdf4b6f463b7a8036d5208b2af9b9e18546f51420e48"} Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671330 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"8585cd738fcdf6a8688c33a04792f8ba0d13f3fea2b2a2b94964b18cc5215665"} Mar 12 18:15:55 crc kubenswrapper[4926]: I0312 18:15:55.671338 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"ccfdccb75afb6065904425e06da5940863d90f7a64de166e8dc69927ef86a796"} Mar 12 18:15:56 crc kubenswrapper[4926]: I0312 18:15:56.502611 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc33af41-5aa0-4254-ac75-69433d5f4ce9" path="/var/lib/kubelet/pods/bc33af41-5aa0-4254-ac75-69433d5f4ce9/volumes" Mar 12 18:15:58 crc kubenswrapper[4926]: I0312 18:15:58.700243 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" 
event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"281b710e00f80b4dbb02237d5d6febfa3c44d97f35dd3bbf53e415b68a8e6873"} Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.125812 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555656-zlrgw"] Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.126915 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.128776 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.129608 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.129717 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.207299 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kp6\" (UniqueName: \"kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6\") pod \"auto-csr-approver-29555656-zlrgw\" (UID: \"b042fb81-959e-48c0-8a9e-87bafcad2fe3\") " pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.308489 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kp6\" (UniqueName: \"kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6\") pod \"auto-csr-approver-29555656-zlrgw\" (UID: \"b042fb81-959e-48c0-8a9e-87bafcad2fe3\") " pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.328958 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kp6\" (UniqueName: \"kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6\") pod \"auto-csr-approver-29555656-zlrgw\" (UID: \"b042fb81-959e-48c0-8a9e-87bafcad2fe3\") " pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.446640 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.464610 4926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(d0c1b896e388da9a50c7240b618174e003ef98ec177a69f69180c6a03b2e7c34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.464696 4926 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(d0c1b896e388da9a50c7240b618174e003ef98ec177a69f69180c6a03b2e7c34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.464716 4926 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(d0c1b896e388da9a50c7240b618174e003ef98ec177a69f69180c6a03b2e7c34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.464753 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(d0c1b896e388da9a50c7240b618174e003ef98ec177a69f69180c6a03b2e7c34): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.713602 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555656-zlrgw"] Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.724940 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.724960 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" event={"ID":"2054c0e9-f3e6-4d68-ab8f-57466f8b3e47","Type":"ContainerStarted","Data":"f4be59f72789562e0b01dc8bccac7a56d05ba2e07d619f4b205456956452b76d"} Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.725514 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.752432 4926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(9113d0c0d90bdc1bca38d13974c2f38873e294bc154dcca3f50372de298592d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.752531 4926 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(9113d0c0d90bdc1bca38d13974c2f38873e294bc154dcca3f50372de298592d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.752563 4926 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(9113d0c0d90bdc1bca38d13974c2f38873e294bc154dcca3f50372de298592d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:00 crc kubenswrapper[4926]: E0312 18:16:00.752629 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(9113d0c0d90bdc1bca38d13974c2f38873e294bc154dcca3f50372de298592d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" Mar 12 18:16:00 crc kubenswrapper[4926]: I0312 18:16:00.759509 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" podStartSLOduration=7.759485408 podStartE2EDuration="7.759485408s" podCreationTimestamp="2026-03-12 18:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:16:00.759353144 +0000 UTC m=+801.127979477" watchObservedRunningTime="2026-03-12 18:16:00.759485408 +0000 UTC m=+801.128111761" Mar 12 18:16:01 crc kubenswrapper[4926]: I0312 18:16:01.731861 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:01 crc kubenswrapper[4926]: I0312 18:16:01.732393 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:01 crc kubenswrapper[4926]: I0312 18:16:01.732417 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:01 crc kubenswrapper[4926]: I0312 18:16:01.775349 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:01 crc kubenswrapper[4926]: I0312 18:16:01.781628 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:04 crc kubenswrapper[4926]: I0312 18:16:04.490142 4926 scope.go:117] "RemoveContainer" containerID="2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481" Mar 12 18:16:04 crc kubenswrapper[4926]: E0312 18:16:04.490885 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xwqvl_openshift-multus(d5a53ef4-c701-457f-9cf2-85819bf04d1a)\"" pod="openshift-multus/multus-xwqvl" podUID="d5a53ef4-c701-457f-9cf2-85819bf04d1a" Mar 12 18:16:15 crc kubenswrapper[4926]: I0312 18:16:15.488970 4926 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:15 crc kubenswrapper[4926]: I0312 18:16:15.490352 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:15 crc kubenswrapper[4926]: E0312 18:16:15.538040 4926 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(ff7b5c0c8b1aa68b40982f6422f49a4b9f491894d885b9649c85dff7ed33dcc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 12 18:16:15 crc kubenswrapper[4926]: E0312 18:16:15.538134 4926 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(ff7b5c0c8b1aa68b40982f6422f49a4b9f491894d885b9649c85dff7ed33dcc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:15 crc kubenswrapper[4926]: E0312 18:16:15.538172 4926 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(ff7b5c0c8b1aa68b40982f6422f49a4b9f491894d885b9649c85dff7ed33dcc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:15 crc kubenswrapper[4926]: E0312 18:16:15.538238 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29555656-zlrgw_openshift-infra(b042fb81-959e-48c0-8a9e-87bafcad2fe3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29555656-zlrgw_openshift-infra_b042fb81-959e-48c0-8a9e-87bafcad2fe3_0(ff7b5c0c8b1aa68b40982f6422f49a4b9f491894d885b9649c85dff7ed33dcc6): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Mar 12 18:16:17 crc kubenswrapper[4926]: I0312 18:16:17.489951 4926 scope.go:117] "RemoveContainer" containerID="2c510c831017eeb7aed88601040d4790cf6d5bebce7b09f246c92ea9b81e2481" Mar 12 18:16:17 crc kubenswrapper[4926]: I0312 18:16:17.850611 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwqvl_d5a53ef4-c701-457f-9cf2-85819bf04d1a/kube-multus/2.log" Mar 12 18:16:17 crc kubenswrapper[4926]: I0312 18:16:17.850690 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwqvl" event={"ID":"d5a53ef4-c701-457f-9cf2-85819bf04d1a","Type":"ContainerStarted","Data":"1b5a9c1c4921fcd84b07bfff93da37ebd99a8d7476c2c239000be3cad6b4c354"} Mar 12 18:16:24 crc kubenswrapper[4926]: I0312 18:16:24.188298 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94d7p" Mar 12 18:16:28 crc kubenswrapper[4926]: I0312 18:16:28.489418 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:28 crc kubenswrapper[4926]: I0312 18:16:28.490650 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:28 crc kubenswrapper[4926]: I0312 18:16:28.766017 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555656-zlrgw"] Mar 12 18:16:28 crc kubenswrapper[4926]: I0312 18:16:28.935391 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" event={"ID":"b042fb81-959e-48c0-8a9e-87bafcad2fe3","Type":"ContainerStarted","Data":"3d3544bca4bb5b1538f3d1d62f8c4d4b5d1bdd2214c7ce446ae4aa62d0f37b88"} Mar 12 18:16:30 crc kubenswrapper[4926]: I0312 18:16:30.959289 4926 generic.go:334] "Generic (PLEG): container finished" podID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" containerID="d82acd09ec6775228cff799be28fec22c80e7a284a7159b9a2b0821199a8fa17" exitCode=0 Mar 12 18:16:30 crc kubenswrapper[4926]: I0312 18:16:30.959404 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" event={"ID":"b042fb81-959e-48c0-8a9e-87bafcad2fe3","Type":"ContainerDied","Data":"d82acd09ec6775228cff799be28fec22c80e7a284a7159b9a2b0821199a8fa17"} Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.246247 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.357005 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8kp6\" (UniqueName: \"kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6\") pod \"b042fb81-959e-48c0-8a9e-87bafcad2fe3\" (UID: \"b042fb81-959e-48c0-8a9e-87bafcad2fe3\") " Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.370695 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6" (OuterVolumeSpecName: "kube-api-access-t8kp6") pod "b042fb81-959e-48c0-8a9e-87bafcad2fe3" (UID: "b042fb81-959e-48c0-8a9e-87bafcad2fe3"). InnerVolumeSpecName "kube-api-access-t8kp6".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.458511 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8kp6\" (UniqueName: \"kubernetes.io/projected/b042fb81-959e-48c0-8a9e-87bafcad2fe3-kube-api-access-t8kp6\") on node \"crc\" DevicePath \"\"" Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.969646 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" event={"ID":"b042fb81-959e-48c0-8a9e-87bafcad2fe3","Type":"ContainerDied","Data":"3d3544bca4bb5b1538f3d1d62f8c4d4b5d1bdd2214c7ce446ae4aa62d0f37b88"} Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.969902 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d3544bca4bb5b1538f3d1d62f8c4d4b5d1bdd2214c7ce446ae4aa62d0f37b88" Mar 12 18:16:32 crc kubenswrapper[4926]: I0312 18:16:32.969692 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555656-zlrgw" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.210115 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq"] Mar 12 18:16:33 crc kubenswrapper[4926]: E0312 18:16:33.210315 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" containerName="oc" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.210327 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" containerName="oc" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.210456 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" containerName="oc" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.211212 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.213065 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.222524 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq"] Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.291757 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555650-t22sq"] Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.295229 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555650-t22sq"] Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.375629 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.376197 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.376308 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97jtx\" (UniqueName: \"kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.477900 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.478026 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.478072 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97jtx\" (UniqueName: \"kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" 
(UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.478590 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.478946 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.503615 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97jtx\" (UniqueName: \"kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:33 crc kubenswrapper[4926]: I0312 18:16:33.547144 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:34 crc kubenswrapper[4926]: I0312 18:16:34.008810 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq"] Mar 12 18:16:34 crc kubenswrapper[4926]: I0312 18:16:34.502935 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5ff64f-2478-4592-8a08-fb47a40a8de5" path="/var/lib/kubelet/pods/5a5ff64f-2478-4592-8a08-fb47a40a8de5/volumes" Mar 12 18:16:34 crc kubenswrapper[4926]: I0312 18:16:34.985991 4926 generic.go:334] "Generic (PLEG): container finished" podID="062d1c31-cb0c-4470-bf74-0fb541319609" containerID="fc8115a8d2e23476b7d9888864d7a849fc584d14e0bc6160ffd2c7b9d5daad64" exitCode=0 Mar 12 18:16:34 crc kubenswrapper[4926]: I0312 18:16:34.986064 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" event={"ID":"062d1c31-cb0c-4470-bf74-0fb541319609","Type":"ContainerDied","Data":"fc8115a8d2e23476b7d9888864d7a849fc584d14e0bc6160ffd2c7b9d5daad64"} Mar 12 18:16:34 crc kubenswrapper[4926]: I0312 18:16:34.986111 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" event={"ID":"062d1c31-cb0c-4470-bf74-0fb541319609","Type":"ContainerStarted","Data":"3b98f38c5476d8f4dc24270676c0499d10c24a9bd4a1b327a146aba83d08aa26"} Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.311571 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"] Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.313614 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.324231 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"] Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.505522 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575gn\" (UniqueName: \"kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.505582 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.505618 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.606796 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.606967 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575gn\" (UniqueName: \"kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.607028 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.607424 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.607599 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.626412 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-575gn\" (UniqueName: \"kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn\") pod \"redhat-operators-h5l25\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.639193 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.839280 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"] Mar 12 18:16:35 crc kubenswrapper[4926]: W0312 18:16:35.846803 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec7f260_07a4_4602_bba9_c08f20f8cfde.slice/crio-2a487c7409c3ac193f99d036af52bf58251bf6af395bf230d56d515dbbd21905 WatchSource:0}: Error finding container 2a487c7409c3ac193f99d036af52bf58251bf6af395bf230d56d515dbbd21905: Status 404 returned error can't find the container with id 2a487c7409c3ac193f99d036af52bf58251bf6af395bf230d56d515dbbd21905 Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.992372 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerStarted","Data":"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"} Mar 12 18:16:35 crc kubenswrapper[4926]: I0312 18:16:35.992410 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerStarted","Data":"2a487c7409c3ac193f99d036af52bf58251bf6af395bf230d56d515dbbd21905"} Mar 12 18:16:37 crc kubenswrapper[4926]: I0312 18:16:37.007853 4926 generic.go:334] "Generic (PLEG): container finished" podID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerID="77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6" exitCode=0 Mar 12 18:16:37 crc kubenswrapper[4926]: I0312 18:16:37.007980 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerDied","Data":"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"} Mar 12 18:16:37 crc kubenswrapper[4926]: I0312 18:16:37.810545 4926 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 18:16:38 crc kubenswrapper[4926]: I0312 18:16:38.017797 4926 generic.go:334] "Generic (PLEG): container finished" podID="062d1c31-cb0c-4470-bf74-0fb541319609" containerID="478e8bb351f2aa6703215d63de484eb3a3adde69425a8c06e70fa114ba1298de" exitCode=0 Mar 12 18:16:38 crc kubenswrapper[4926]: I0312 18:16:38.017903 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" event={"ID":"062d1c31-cb0c-4470-bf74-0fb541319609","Type":"ContainerDied","Data":"478e8bb351f2aa6703215d63de484eb3a3adde69425a8c06e70fa114ba1298de"} Mar 12 18:16:38 crc kubenswrapper[4926]: I0312 18:16:38.020629 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerStarted","Data":"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"} Mar 12 18:16:39 crc kubenswrapper[4926]: I0312 
18:16:39.036725 4926 generic.go:334] "Generic (PLEG): container finished" podID="062d1c31-cb0c-4470-bf74-0fb541319609" containerID="02f4293db8ec40c24e25998ced5750ff8c11823ccd164966d00cb2b50fd60c19" exitCode=0 Mar 12 18:16:39 crc kubenswrapper[4926]: I0312 18:16:39.037252 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" event={"ID":"062d1c31-cb0c-4470-bf74-0fb541319609","Type":"ContainerDied","Data":"02f4293db8ec40c24e25998ced5750ff8c11823ccd164966d00cb2b50fd60c19"} Mar 12 18:16:39 crc kubenswrapper[4926]: I0312 18:16:39.047393 4926 generic.go:334] "Generic (PLEG): container finished" podID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerID="74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320" exitCode=0 Mar 12 18:16:39 crc kubenswrapper[4926]: I0312 18:16:39.047475 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerDied","Data":"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"} Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.058601 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerStarted","Data":"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"} Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.087839 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h5l25" podStartSLOduration=2.556300414 podStartE2EDuration="5.087816594s" podCreationTimestamp="2026-03-12 18:16:35 +0000 UTC" firstStartedPulling="2026-03-12 18:16:37.010360518 +0000 UTC m=+837.378986881" lastFinishedPulling="2026-03-12 18:16:39.541876718 +0000 UTC m=+839.910503061" observedRunningTime="2026-03-12 18:16:40.085050618 +0000 UTC m=+840.453676951" watchObservedRunningTime="2026-03-12 18:16:40.087816594 +0000 UTC m=+840.456442977" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.405713 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.482811 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle\") pod \"062d1c31-cb0c-4470-bf74-0fb541319609\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.482874 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97jtx\" (UniqueName: \"kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx\") pod \"062d1c31-cb0c-4470-bf74-0fb541319609\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.483044 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util\") pod \"062d1c31-cb0c-4470-bf74-0fb541319609\" (UID: \"062d1c31-cb0c-4470-bf74-0fb541319609\") " Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.484018 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle" (OuterVolumeSpecName: "bundle") pod "062d1c31-cb0c-4470-bf74-0fb541319609" (UID: "062d1c31-cb0c-4470-bf74-0fb541319609"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.490609 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx" (OuterVolumeSpecName: "kube-api-access-97jtx") pod "062d1c31-cb0c-4470-bf74-0fb541319609" (UID: "062d1c31-cb0c-4470-bf74-0fb541319609"). InnerVolumeSpecName "kube-api-access-97jtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.496905 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util" (OuterVolumeSpecName: "util") pod "062d1c31-cb0c-4470-bf74-0fb541319609" (UID: "062d1c31-cb0c-4470-bf74-0fb541319609"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.584249 4926 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-util\") on node \"crc\" DevicePath \"\"" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.584295 4926 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/062d1c31-cb0c-4470-bf74-0fb541319609-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:16:40 crc kubenswrapper[4926]: I0312 18:16:40.584314 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97jtx\" (UniqueName: \"kubernetes.io/projected/062d1c31-cb0c-4470-bf74-0fb541319609-kube-api-access-97jtx\") on node \"crc\" DevicePath \"\"" Mar 12 18:16:41 crc kubenswrapper[4926]: I0312 18:16:41.069080 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" Mar 12 18:16:41 crc kubenswrapper[4926]: I0312 18:16:41.069267 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq" event={"ID":"062d1c31-cb0c-4470-bf74-0fb541319609","Type":"ContainerDied","Data":"3b98f38c5476d8f4dc24270676c0499d10c24a9bd4a1b327a146aba83d08aa26"} Mar 12 18:16:41 crc kubenswrapper[4926]: I0312 18:16:41.069808 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b98f38c5476d8f4dc24270676c0499d10c24a9bd4a1b327a146aba83d08aa26" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.601743 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv"] Mar 12 18:16:43 crc kubenswrapper[4926]: E0312 18:16:43.601992 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="pull" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.602006 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="pull" Mar 12 18:16:43 crc kubenswrapper[4926]: E0312 18:16:43.602028 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="util" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.602036 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="util" Mar 12 18:16:43 crc kubenswrapper[4926]: E0312 18:16:43.602049 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="extract" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.602059 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="extract" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.602184 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="062d1c31-cb0c-4470-bf74-0fb541319609" containerName="extract" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.602658 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.604400 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.604655 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.610010 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-695zc" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.611719 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv"] Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.624930 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c4gt\" (UniqueName: \"kubernetes.io/projected/7247f6a4-e62d-44e3-b91c-1117fca5c960-kube-api-access-9c4gt\") pod \"nmstate-operator-796d4cfff4-wnkdv\" (UID: \"7247f6a4-e62d-44e3-b91c-1117fca5c960\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.735380 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c4gt\" (UniqueName: \"kubernetes.io/projected/7247f6a4-e62d-44e3-b91c-1117fca5c960-kube-api-access-9c4gt\") pod \"nmstate-operator-796d4cfff4-wnkdv\" (UID: \"7247f6a4-e62d-44e3-b91c-1117fca5c960\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.753526 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c4gt\" (UniqueName: \"kubernetes.io/projected/7247f6a4-e62d-44e3-b91c-1117fca5c960-kube-api-access-9c4gt\") pod \"nmstate-operator-796d4cfff4-wnkdv\" (UID: \"7247f6a4-e62d-44e3-b91c-1117fca5c960\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" Mar 12 18:16:43 crc kubenswrapper[4926]: I0312 18:16:43.916989 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" Mar 12 18:16:44 crc kubenswrapper[4926]: I0312 18:16:44.169277 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv"] Mar 12 18:16:44 crc kubenswrapper[4926]: W0312 18:16:44.172495 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7247f6a4_e62d_44e3_b91c_1117fca5c960.slice/crio-5799508215f1be1b36b37688c4128bd4007438ec6aab48f2b9928063c5f1a50f WatchSource:0}: Error finding container 5799508215f1be1b36b37688c4128bd4007438ec6aab48f2b9928063c5f1a50f: Status 404 returned error can't find the container with id 5799508215f1be1b36b37688c4128bd4007438ec6aab48f2b9928063c5f1a50f Mar 12 18:16:45 crc kubenswrapper[4926]: I0312 18:16:45.098284 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" event={"ID":"7247f6a4-e62d-44e3-b91c-1117fca5c960","Type":"ContainerStarted","Data":"5799508215f1be1b36b37688c4128bd4007438ec6aab48f2b9928063c5f1a50f"} Mar 12 18:16:45 crc kubenswrapper[4926]: I0312 18:16:45.639883 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:45 crc kubenswrapper[4926]: I0312 18:16:45.640279 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:46 crc kubenswrapper[4926]: I0312 18:16:46.682863 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5l25" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="registry-server" probeResult="failure" output=< Mar 12 18:16:46 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:16:46 crc kubenswrapper[4926]: > Mar 12 18:16:47 crc kubenswrapper[4926]: I0312 18:16:47.115316 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" event={"ID":"7247f6a4-e62d-44e3-b91c-1117fca5c960","Type":"ContainerStarted","Data":"c53d52de6035183f9ebe6f6214ef9eec047645edbfcc2112636630f37a358302"} Mar 12 18:16:47 crc kubenswrapper[4926]: I0312 18:16:47.144670 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-wnkdv" podStartSLOduration=1.6531536610000002 podStartE2EDuration="4.144652096s" podCreationTimestamp="2026-03-12 18:16:43 +0000 UTC" firstStartedPulling="2026-03-12 18:16:44.174792058 +0000 UTC m=+844.543418411" lastFinishedPulling="2026-03-12 18:16:46.666290513 +0000 UTC m=+847.034916846" observedRunningTime="2026-03-12 18:16:47.141816728 +0000 UTC m=+847.510443071" watchObservedRunningTime="2026-03-12 18:16:47.144652096 +0000 UTC m=+847.513278429" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.197479 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.199479 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.202164 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9nqhs" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.213874 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.244275 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.247597 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.256312 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.271680 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-z52rq"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.272686 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.273553 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbcg\" (UniqueName: \"kubernetes.io/projected/cf97acff-2650-4e84-ab30-d10f9bd70ef4-kube-api-access-dkbcg\") pod \"nmstate-metrics-9b8c8685d-rn7dj\" (UID: \"cf97acff-2650-4e84-ab30-d10f9bd70ef4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.273614 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.273729 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllkp\" (UniqueName: \"kubernetes.io/projected/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-kube-api-access-kllkp\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.301690 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.342691 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.343564 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.346829 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.347029 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ptr59" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.347208 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.350745 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375306 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375360 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-dbus-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375386 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-nmstate-lock\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375466 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-ovs-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375515 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kllkp\" (UniqueName: \"kubernetes.io/projected/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-kube-api-access-kllkp\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375554 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375592 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbcg\" (UniqueName: \"kubernetes.io/projected/cf97acff-2650-4e84-ab30-d10f9bd70ef4-kube-api-access-dkbcg\") pod 
\"nmstate-metrics-9b8c8685d-rn7dj\" (UID: \"cf97acff-2650-4e84-ab30-d10f9bd70ef4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375615 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg5fc\" (UniqueName: \"kubernetes.io/projected/9db224ca-f640-4756-9c80-afe7ff63dcbe-kube-api-access-hg5fc\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375634 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.375650 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spkt\" (UniqueName: \"kubernetes.io/projected/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-kube-api-access-2spkt\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: E0312 18:16:53.375830 4926 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 12 18:16:53 crc kubenswrapper[4926]: E0312 18:16:53.375878 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair podName:bdc3504f-bf1b-4b71-aaf2-45e24e41a84e nodeName:}" failed. No retries permitted until 2026-03-12 18:16:53.875862522 +0000 UTC m=+854.244488855 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair") pod "nmstate-webhook-5f558f5558-h5ll8" (UID: "bdc3504f-bf1b-4b71-aaf2-45e24e41a84e") : secret "openshift-nmstate-webhook" not found Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.394767 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kllkp\" (UniqueName: \"kubernetes.io/projected/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-kube-api-access-kllkp\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.396572 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbcg\" (UniqueName: \"kubernetes.io/projected/cf97acff-2650-4e84-ab30-d10f9bd70ef4-kube-api-access-dkbcg\") pod \"nmstate-metrics-9b8c8685d-rn7dj\" (UID: \"cf97acff-2650-4e84-ab30-d10f9bd70ef4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477091 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477593 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-dbus-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: E0312 18:16:53.477301 4926 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477643 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-nmstate-lock\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477671 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-ovs-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: E0312 18:16:53.477697 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert podName:9e731d72-a0a8-46a4-af0b-5dce65f29dd1 nodeName:}" failed. No retries permitted until 2026-03-12 18:16:53.97767439 +0000 UTC m=+854.346300733 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-67hvb" (UID: "9e731d72-a0a8-46a4-af0b-5dce65f29dd1") : secret "plugin-serving-cert" not found Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477714 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477787 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg5fc\" (UniqueName: \"kubernetes.io/projected/9db224ca-f640-4756-9c80-afe7ff63dcbe-kube-api-access-hg5fc\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477837 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2spkt\" (UniqueName: \"kubernetes.io/projected/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-kube-api-access-2spkt\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.477978 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-ovs-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.478162 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-nmstate-lock\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.478266 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9db224ca-f640-4756-9c80-afe7ff63dcbe-dbus-socket\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.479047 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.497102 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg5fc\" (UniqueName: \"kubernetes.io/projected/9db224ca-f640-4756-9c80-afe7ff63dcbe-kube-api-access-hg5fc\") pod \"nmstate-handler-z52rq\" (UID: \"9db224ca-f640-4756-9c80-afe7ff63dcbe\") " pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.497177 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-2spkt\" (UniqueName: \"kubernetes.io/projected/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-kube-api-access-2spkt\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.518235 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.525204 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dd74c7c8d-h748s"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.526063 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.577507 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd74c7c8d-h748s"] Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578780 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-trusted-ca-bundle\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578841 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-oauth-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578870 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-console-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578892 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-service-ca\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578930 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.578980 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fkz\" (UniqueName: \"kubernetes.io/projected/a2872acc-add0-4247-b9de-29182d576add-kube-api-access-99fkz\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.579048 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-oauth-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.602049 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:53 crc kubenswrapper[4926]: W0312 18:16:53.621687 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db224ca_f640_4756_9c80_afe7ff63dcbe.slice/crio-af86013ef6d8eabf4ac321f53ed9fbc379671a4d43fa8c4ddfa2f271033910d1 WatchSource:0}: Error finding container af86013ef6d8eabf4ac321f53ed9fbc379671a4d43fa8c4ddfa2f271033910d1: Status 404 returned error can't find the container with id af86013ef6d8eabf4ac321f53ed9fbc379671a4d43fa8c4ddfa2f271033910d1 Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680433 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-oauth-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680733 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-console-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680756 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-service-ca\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680780 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680825 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fkz\" (UniqueName: \"kubernetes.io/projected/a2872acc-add0-4247-b9de-29182d576add-kube-api-access-99fkz\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680876 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-oauth-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.680933 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-trusted-ca-bundle\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.682320 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-trusted-ca-bundle\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.682576 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-oauth-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.683173 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-console-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.683768 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2872acc-add0-4247-b9de-29182d576add-service-ca\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.687168 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-oauth-config\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.687546 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2872acc-add0-4247-b9de-29182d576add-console-serving-cert\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.714230 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fkz\" (UniqueName: \"kubernetes.io/projected/a2872acc-add0-4247-b9de-29182d576add-kube-api-access-99fkz\") pod \"console-5dd74c7c8d-h748s\" (UID: \"a2872acc-add0-4247-b9de-29182d576add\") " pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.867208 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.882645 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.886833 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bdc3504f-bf1b-4b71-aaf2-45e24e41a84e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-h5ll8\" (UID: \"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.985047 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:53 crc kubenswrapper[4926]: I0312 18:16:53.990993 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e731d72-a0a8-46a4-af0b-5dce65f29dd1-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-67hvb\" (UID: \"9e731d72-a0a8-46a4-af0b-5dce65f29dd1\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.018956 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj"] Mar 12 18:16:54 crc kubenswrapper[4926]: W0312 18:16:54.085948 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2872acc_add0_4247_b9de_29182d576add.slice/crio-0dd02ac57b1182a64b9e652af6125f6d4b80a1fa0283d2cf0121b614df75be64 WatchSource:0}: Error finding container 0dd02ac57b1182a64b9e652af6125f6d4b80a1fa0283d2cf0121b614df75be64: Status 404 returned error can't find the container with id 0dd02ac57b1182a64b9e652af6125f6d4b80a1fa0283d2cf0121b614df75be64 Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.085967 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd74c7c8d-h748s"] Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.162267 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd74c7c8d-h748s" event={"ID":"a2872acc-add0-4247-b9de-29182d576add","Type":"ContainerStarted","Data":"0dd02ac57b1182a64b9e652af6125f6d4b80a1fa0283d2cf0121b614df75be64"} Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.163720 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z52rq" event={"ID":"9db224ca-f640-4756-9c80-afe7ff63dcbe","Type":"ContainerStarted","Data":"af86013ef6d8eabf4ac321f53ed9fbc379671a4d43fa8c4ddfa2f271033910d1"} Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.165328 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" event={"ID":"cf97acff-2650-4e84-ab30-d10f9bd70ef4","Type":"ContainerStarted","Data":"d4fa75e5be6ba0fe4bfed768e68798f6fff5cf6e00898b5ea44fe66e500455d4"} Mar 12 
18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.176911 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.258073 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.474180 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb"] Mar 12 18:16:54 crc kubenswrapper[4926]: W0312 18:16:54.480582 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e731d72_a0a8_46a4_af0b_5dce65f29dd1.slice/crio-f147a0ab76c9688735ad96ef60ec96bc8652f2bfbfd1da41905f0fa1163a475f WatchSource:0}: Error finding container f147a0ab76c9688735ad96ef60ec96bc8652f2bfbfd1da41905f0fa1163a475f: Status 404 returned error can't find the container with id f147a0ab76c9688735ad96ef60ec96bc8652f2bfbfd1da41905f0fa1163a475f Mar 12 18:16:54 crc kubenswrapper[4926]: I0312 18:16:54.594343 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8"] Mar 12 18:16:54 crc kubenswrapper[4926]: W0312 18:16:54.603760 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc3504f_bf1b_4b71_aaf2_45e24e41a84e.slice/crio-10294b79b92d5b6599d2e7bdd3189cfd8c9071c3e5890eaf69101a049a18c76d WatchSource:0}: Error finding container 10294b79b92d5b6599d2e7bdd3189cfd8c9071c3e5890eaf69101a049a18c76d: Status 404 returned error can't find the container with id 10294b79b92d5b6599d2e7bdd3189cfd8c9071c3e5890eaf69101a049a18c76d Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.174054 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd74c7c8d-h748s" event={"ID":"a2872acc-add0-4247-b9de-29182d576add","Type":"ContainerStarted","Data":"409ad4780a657ca30f7de6f8e1fd1934f3400a8d18525ae3304ebe2a7c68ca46"} Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.175178 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" event={"ID":"9e731d72-a0a8-46a4-af0b-5dce65f29dd1","Type":"ContainerStarted","Data":"f147a0ab76c9688735ad96ef60ec96bc8652f2bfbfd1da41905f0fa1163a475f"} Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.177113 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" event={"ID":"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e","Type":"ContainerStarted","Data":"10294b79b92d5b6599d2e7bdd3189cfd8c9071c3e5890eaf69101a049a18c76d"} Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.198131 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dd74c7c8d-h748s" podStartSLOduration=2.198113346 podStartE2EDuration="2.198113346s" podCreationTimestamp="2026-03-12 18:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:16:55.193484002 +0000 UTC m=+855.562110335" watchObservedRunningTime="2026-03-12 18:16:55.198113346 +0000 UTC m=+855.566739689" Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.692456 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.740405 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:55 crc kubenswrapper[4926]: I0312 18:16:55.934588 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"] Mar 12 18:16:56 crc kubenswrapper[4926]: I0312 18:16:56.817306 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:16:56 crc kubenswrapper[4926]: I0312 18:16:56.817364 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.190976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" event={"ID":"bdc3504f-bf1b-4b71-aaf2-45e24e41a84e","Type":"ContainerStarted","Data":"f9ad9d26f11464ab92e6a67ee91b0719f50bb451d1c1460ba378a94a95476530"} Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.191097 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.192580 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-z52rq" event={"ID":"9db224ca-f640-4756-9c80-afe7ff63dcbe","Type":"ContainerStarted","Data":"436dde831c9f9d785be41544e9dacc6364c03710a7397c86cc5b9441ff5b331f"} Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.192831 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-z52rq" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.194481 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" event={"ID":"cf97acff-2650-4e84-ab30-d10f9bd70ef4","Type":"ContainerStarted","Data":"ec1620f8d6850b5716c49100add4fec7a885301893db061d8983844b50be4a57"} Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.194625 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h5l25" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="registry-server" containerID="cri-o://cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5" gracePeriod=2 Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.213361 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" podStartSLOduration=2.521425055 podStartE2EDuration="4.213340885s" podCreationTimestamp="2026-03-12 18:16:53 +0000 UTC" firstStartedPulling="2026-03-12 18:16:54.605204869 +0000 UTC m=+854.973831202" lastFinishedPulling="2026-03-12 18:16:56.297120699 +0000 UTC m=+856.665747032" observedRunningTime="2026-03-12 18:16:57.205884302 +0000 UTC m=+857.574510635" watchObservedRunningTime="2026-03-12 18:16:57.213340885 +0000 UTC m=+857.581967218" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 
18:16:57.227252 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-z52rq" podStartSLOduration=1.554460382 podStartE2EDuration="4.227235767s" podCreationTimestamp="2026-03-12 18:16:53 +0000 UTC" firstStartedPulling="2026-03-12 18:16:53.623369683 +0000 UTC m=+853.991996026" lastFinishedPulling="2026-03-12 18:16:56.296145068 +0000 UTC m=+856.664771411" observedRunningTime="2026-03-12 18:16:57.225613856 +0000 UTC m=+857.594240189" watchObservedRunningTime="2026-03-12 18:16:57.227235767 +0000 UTC m=+857.595862100" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.525990 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5l25" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.535746 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content\") pod \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.535936 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575gn\" (UniqueName: \"kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn\") pod \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.536039 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities\") pod \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\" (UID: \"1ec7f260-07a4-4602-bba9-c08f20f8cfde\") " Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.540109 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities" (OuterVolumeSpecName: "utilities") pod "1ec7f260-07a4-4602-bba9-c08f20f8cfde" (UID: "1ec7f260-07a4-4602-bba9-c08f20f8cfde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.547981 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn" (OuterVolumeSpecName: "kube-api-access-575gn") pod "1ec7f260-07a4-4602-bba9-c08f20f8cfde" (UID: "1ec7f260-07a4-4602-bba9-c08f20f8cfde"). InnerVolumeSpecName "kube-api-access-575gn". 
Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.638025 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575gn\" (UniqueName: \"kubernetes.io/projected/1ec7f260-07a4-4602-bba9-c08f20f8cfde-kube-api-access-575gn\") on node \"crc\" DevicePath \"\""
Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.638052 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.692228 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ec7f260-07a4-4602-bba9-c08f20f8cfde" (UID: "1ec7f260-07a4-4602-bba9-c08f20f8cfde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:16:57 crc kubenswrapper[4926]: I0312 18:16:57.739526 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ec7f260-07a4-4602-bba9-c08f20f8cfde-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.208088 4926 generic.go:334] "Generic (PLEG): container finished" podID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerID="cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5" exitCode=0
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.208237 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerDied","Data":"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"}
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.208280 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5l25" event={"ID":"1ec7f260-07a4-4602-bba9-c08f20f8cfde","Type":"ContainerDied","Data":"2a487c7409c3ac193f99d036af52bf58251bf6af395bf230d56d515dbbd21905"}
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.208309 4926 scope.go:117] "RemoveContainer" containerID="cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.208533 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5l25"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.213307 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" event={"ID":"9e731d72-a0a8-46a4-af0b-5dce65f29dd1","Type":"ContainerStarted","Data":"993d2423c1bbc3b0f46463a6f134db5ee4e6d76062f2c251b25aaf8c7391ad8f"}
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.234899 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-67hvb" podStartSLOduration=2.495741106 podStartE2EDuration="5.234874076s" podCreationTimestamp="2026-03-12 18:16:53 +0000 UTC" firstStartedPulling="2026-03-12 18:16:54.482012537 +0000 UTC m=+854.850638890" lastFinishedPulling="2026-03-12 18:16:57.221145527 +0000 UTC m=+857.589771860" observedRunningTime="2026-03-12 18:16:58.227060373 +0000 UTC m=+858.595686716" watchObservedRunningTime="2026-03-12 18:16:58.234874076 +0000 UTC m=+858.603500419"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.238682 4926 scope.go:117] "RemoveContainer" containerID="74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.264619 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"]
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.268133 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h5l25"]
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.271562 4926 scope.go:117] "RemoveContainer" containerID="77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.295164 4926 scope.go:117] "RemoveContainer" containerID="cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"
Mar 12 18:16:58 crc kubenswrapper[4926]: E0312 18:16:58.295734 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5\": container with ID starting with cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5 not found: ID does not exist" containerID="cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.295766 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5"} err="failed to get container status \"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5\": rpc error: code = NotFound desc = could not find container \"cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5\": container with ID starting with cde59c82a4f343b3652464d85c7c4f6f5de5b3c10538053ac4d1fbf861b525c5 not found: ID does not exist"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.295793 4926 scope.go:117] "RemoveContainer" containerID="74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"
Mar 12 18:16:58 crc kubenswrapper[4926]: E0312 18:16:58.296228 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320\": container with ID starting with 74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320 not found: ID does not exist" containerID="74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"
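The RemoveContainer attempts above fail with NotFound because the containers are already gone, and the kubelet logs the error and moves on rather than retrying. A minimal sketch of that idempotent-deletion pattern, assuming a gRPC-backed runtime client; the callback type and names here are invented for illustration, only the NotFound handling mirrors the log:

```go
// Sketch: a status lookup that fails with NotFound after deletion means the
// container is already removed, which is the desired end state, not an error.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// statusFn stands in for a CRI ContainerStatus call (invented for this sketch).
type statusFn func(id string) error

// confirmRemoved treats NotFound as success rather than as a failure.
func confirmRemoved(containerStatus statusFn, id string) error {
	err := containerStatus(id)
	switch {
	case err == nil:
		return fmt.Errorf("container %s still exists", id)
	case status.Code(err) == codes.NotFound:
		return nil // already deleted; nothing left to do
	default:
		return err // a real runtime failure
	}
}

func main() {
	gone := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(confirmRemoved(gone, "cde59c82")) // <nil>
}
```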
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.296272 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320"} err="failed to get container status \"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320\": rpc error: code = NotFound desc = could not find container \"74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320\": container with ID starting with 74c5cf37effbecfaa9e3318f6a84edaeefe25fa2f0dec8231b3281e40bc9f320 not found: ID does not exist"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.296302 4926 scope.go:117] "RemoveContainer" containerID="77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"
Mar 12 18:16:58 crc kubenswrapper[4926]: E0312 18:16:58.296705 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6\": container with ID starting with 77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6 not found: ID does not exist" containerID="77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.296734 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6"} err="failed to get container status \"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6\": rpc error: code = NotFound desc = could not find container \"77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6\": container with ID starting with 77185795828a5381b76fc3f332b62ff134a55c6dbcf3c11bcc7dd2885f8b72b6 not found: ID does not exist"
Mar 12 18:16:58 crc kubenswrapper[4926]: I0312 18:16:58.498820 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" path="/var/lib/kubelet/pods/1ec7f260-07a4-4602-bba9-c08f20f8cfde/volumes"
Mar 12 18:16:59 crc kubenswrapper[4926]: I0312 18:16:59.224582 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" event={"ID":"cf97acff-2650-4e84-ab30-d10f9bd70ef4","Type":"ContainerStarted","Data":"2059feca55eaa3385483d3f33e15ac1922628218eb464dd31a38e38a262e1cfa"}
Mar 12 18:17:01 crc kubenswrapper[4926]: I0312 18:17:01.125302 4926 scope.go:117] "RemoveContainer" containerID="1d8805e76417daef23d168ee9dafcbdf012b728e6e9278f288005edef5f23d20"
Mar 12 18:17:03 crc kubenswrapper[4926]: I0312 18:17:03.631215 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-z52rq"
Mar 12 18:17:03 crc kubenswrapper[4926]: I0312 18:17:03.653529 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-rn7dj" podStartSLOduration=5.641639991 podStartE2EDuration="10.653510991s" podCreationTimestamp="2026-03-12 18:16:53 +0000 UTC" firstStartedPulling="2026-03-12 18:16:54.018646511 +0000 UTC m=+854.387272884" lastFinishedPulling="2026-03-12 18:16:59.030517541 +0000 UTC m=+859.399143884" observedRunningTime="2026-03-12 18:16:59.247090858 +0000 UTC m=+859.615717241" watchObservedRunningTime="2026-03-12 18:17:03.653510991 +0000 UTC m=+864.022137334"
Mar 12 18:17:03 crc kubenswrapper[4926]: I0312 18:17:03.867684 4926
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:17:03 crc kubenswrapper[4926]: I0312 18:17:03.867782 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:17:03 crc kubenswrapper[4926]: I0312 18:17:03.876056 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:17:04 crc kubenswrapper[4926]: I0312 18:17:04.270684 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dd74c7c8d-h748s" Mar 12 18:17:04 crc kubenswrapper[4926]: I0312 18:17:04.323256 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"] Mar 12 18:17:14 crc kubenswrapper[4926]: I0312 18:17:14.186512 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-h5ll8" Mar 12 18:17:26 crc kubenswrapper[4926]: I0312 18:17:26.817693 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:17:26 crc kubenswrapper[4926]: I0312 18:17:26.818643 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.194739 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg"] Mar 12 18:17:29 crc kubenswrapper[4926]: E0312 18:17:29.197968 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="extract-utilities" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.197994 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="extract-utilities" Mar 12 18:17:29 crc kubenswrapper[4926]: E0312 18:17:29.198005 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="extract-content" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.198014 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="extract-content" Mar 12 18:17:29 crc kubenswrapper[4926]: E0312 18:17:29.198045 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="registry-server" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.198055 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="registry-server" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.198201 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec7f260-07a4-4602-bba9-c08f20f8cfde" containerName="registry-server" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.199204 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.199355 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg"] Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.200902 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.278897 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.278956 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5cp\" (UniqueName: \"kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.279028 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.366266 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-vb9qx" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" containerID="cri-o://32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b" gracePeriod=15 Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.380770 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.380895 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.381420 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: 
\"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.381383 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.381429 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5cp\" (UniqueName: \"kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.400388 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5cp\" (UniqueName: \"kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.517906 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.695088 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vb9qx_270031fa-3d83-4edf-bb5d-19ce9e1a693d/console/0.log" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.695351 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-vb9qx" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.702806 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg"] Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788289 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t46vq\" (UniqueName: \"kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788387 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788525 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788573 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788634 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788694 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.788758 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca\") pod \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\" (UID: \"270031fa-3d83-4edf-bb5d-19ce9e1a693d\") " Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.789482 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config" (OuterVolumeSpecName: "console-config") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.790026 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.790179 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca" (OuterVolumeSpecName: "service-ca") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.790218 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.795475 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.795989 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.797596 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq" (OuterVolumeSpecName: "kube-api-access-t46vq") pod "270031fa-3d83-4edf-bb5d-19ce9e1a693d" (UID: "270031fa-3d83-4edf-bb5d-19ce9e1a693d"). InnerVolumeSpecName "kube-api-access-t46vq". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890544 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46vq\" (UniqueName: \"kubernetes.io/projected/270031fa-3d83-4edf-bb5d-19ce9e1a693d-kube-api-access-t46vq\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890571 4926 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890580 4926 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890588 4926 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890597 4926 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/270031fa-3d83-4edf-bb5d-19ce9e1a693d-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890606 4926 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:29 crc kubenswrapper[4926]: I0312 18:17:29.890614 4926 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/270031fa-3d83-4edf-bb5d-19ce9e1a693d-service-ca\") on node \"crc\" DevicePath \"\""
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.436700 4926 generic.go:334] "Generic (PLEG): container finished" podID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerID="990a6bbc9787734ec00d8b44f262ac34b09786254030da04e8d02b2392609120" exitCode=0
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.436789 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerDied","Data":"990a6bbc9787734ec00d8b44f262ac34b09786254030da04e8d02b2392609120"}
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.437049 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerStarted","Data":"2f45bee694048e411034283204782edde4d63f3b997a31a54e77437bd55ef983"}
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440519 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-vb9qx_270031fa-3d83-4edf-bb5d-19ce9e1a693d/console/0.log"
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440617 4926 generic.go:334] "Generic (PLEG): container finished" podID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerID="32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b" exitCode=2
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440665 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vb9qx" event={"ID":"270031fa-3d83-4edf-bb5d-19ce9e1a693d","Type":"ContainerDied","Data":"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"}
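Each of the console pod's volumes above passes through the same three logged phases: "operationExecutor.UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and finally "Volume detached" (reconciler_common.go:293). A toy reconciler loop showing that ordering; the types and the tearDown callback are invented for this sketch, not the kubelet's code:

```go
// Sketch of the unmount -> teardown -> detach phases visible in the log.
package main

import "fmt"

type volume struct{ name, plugin string }

func reconcileTeardown(vols []volume, tearDown func(volume) error) {
	for _, v := range vols {
		fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v.name)
		if err := tearDown(v); err != nil {
			// On failure the volume stays in the actual-state cache and is
			// retried on a later reconciler pass.
			fmt.Printf("UnmountVolume.TearDown failed for volume %q: %v\n", v.name, err)
			continue
		}
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q (PluginName %q)\n", v.name, v.plugin)
		fmt.Printf("Volume detached for volume %q\n", v.name)
	}
}

func main() {
	vols := []volume{
		{"console-config", "kubernetes.io/configmap"},
		{"console-serving-cert", "kubernetes.io/secret"},
	}
	reconcileTeardown(vols, func(volume) error { return nil })
}
```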
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440704 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-vb9qx" event={"ID":"270031fa-3d83-4edf-bb5d-19ce9e1a693d","Type":"ContainerDied","Data":"8b8722347bac291c780b1fdd439b9d8743876830528929d7bd120c0953d14111"}
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440733 4926 scope.go:117] "RemoveContainer" containerID="32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.440921 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-vb9qx"
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.441026 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.500283 4926 scope.go:117] "RemoveContainer" containerID="32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"
Mar 12 18:17:30 crc kubenswrapper[4926]: E0312 18:17:30.501007 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b\": container with ID starting with 32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b not found: ID does not exist" containerID="32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.501061 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b"} err="failed to get container status \"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b\": rpc error: code = NotFound desc = could not find container \"32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b\": container with ID starting with 32bb48f2aa4bede9bbf0c7b620e2f860f26e2deae6fde3c60c9a415822d1b25b not found: ID does not exist"
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.510914 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"]
Mar 12 18:17:30 crc kubenswrapper[4926]: I0312 18:17:30.510971 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-vb9qx"]
Mar 12 18:17:32 crc kubenswrapper[4926]: I0312 18:17:32.590462 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" path="/var/lib/kubelet/pods/270031fa-3d83-4edf-bb5d-19ce9e1a693d/volumes"
Mar 12 18:17:32 crc kubenswrapper[4926]: I0312 18:17:32.592196 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerStarted","Data":"ccb2939e93467122e809759af18c473c961879e45f9a02cfc6a36f1cb08e16c7"}
Mar 12 18:17:33 crc kubenswrapper[4926]: I0312 18:17:33.605752 4926 generic.go:334] "Generic (PLEG): container finished" podID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerID="ccb2939e93467122e809759af18c473c961879e45f9a02cfc6a36f1cb08e16c7" exitCode=0
Mar 12 18:17:33 crc kubenswrapper[4926]: I0312 18:17:33.605833 4926
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerDied","Data":"ccb2939e93467122e809759af18c473c961879e45f9a02cfc6a36f1cb08e16c7"} Mar 12 18:17:34 crc kubenswrapper[4926]: I0312 18:17:34.616329 4926 generic.go:334] "Generic (PLEG): container finished" podID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerID="2c66fe18bef8f2684627fbbbebbb90bceda864ef27b9acf48fed31542186e3f5" exitCode=0 Mar 12 18:17:34 crc kubenswrapper[4926]: I0312 18:17:34.616581 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerDied","Data":"2c66fe18bef8f2684627fbbbebbb90bceda864ef27b9acf48fed31542186e3f5"} Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.933528 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.974313 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util\") pod \"a46be1f3-50fc-45b5-a480-98d9763db69d\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.974645 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq5cp\" (UniqueName: \"kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp\") pod \"a46be1f3-50fc-45b5-a480-98d9763db69d\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.974737 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle\") pod \"a46be1f3-50fc-45b5-a480-98d9763db69d\" (UID: \"a46be1f3-50fc-45b5-a480-98d9763db69d\") " Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.976645 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle" (OuterVolumeSpecName: "bundle") pod "a46be1f3-50fc-45b5-a480-98d9763db69d" (UID: "a46be1f3-50fc-45b5-a480-98d9763db69d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:17:35 crc kubenswrapper[4926]: I0312 18:17:35.980878 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp" (OuterVolumeSpecName: "kube-api-access-lq5cp") pod "a46be1f3-50fc-45b5-a480-98d9763db69d" (UID: "a46be1f3-50fc-45b5-a480-98d9763db69d"). InnerVolumeSpecName "kube-api-access-lq5cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.000122 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util" (OuterVolumeSpecName: "util") pod "a46be1f3-50fc-45b5-a480-98d9763db69d" (UID: "a46be1f3-50fc-45b5-a480-98d9763db69d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.076657 4926 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.076714 4926 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a46be1f3-50fc-45b5-a480-98d9763db69d-util\") on node \"crc\" DevicePath \"\"" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.076726 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq5cp\" (UniqueName: \"kubernetes.io/projected/a46be1f3-50fc-45b5-a480-98d9763db69d-kube-api-access-lq5cp\") on node \"crc\" DevicePath \"\"" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.635646 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" event={"ID":"a46be1f3-50fc-45b5-a480-98d9763db69d","Type":"ContainerDied","Data":"2f45bee694048e411034283204782edde4d63f3b997a31a54e77437bd55ef983"} Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.635682 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f45bee694048e411034283204782edde4d63f3b997a31a54e77437bd55ef983" Mar 12 18:17:36 crc kubenswrapper[4926]: I0312 18:17:36.635788 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.638257 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8"] Mar 12 18:17:44 crc kubenswrapper[4926]: E0312 18:17:44.639093 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639108 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" Mar 12 18:17:44 crc kubenswrapper[4926]: E0312 18:17:44.639120 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="pull" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639127 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="pull" Mar 12 18:17:44 crc kubenswrapper[4926]: E0312 18:17:44.639136 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="extract" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639145 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="extract" Mar 12 18:17:44 crc kubenswrapper[4926]: E0312 18:17:44.639158 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="util" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639165 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="util" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639281 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="270031fa-3d83-4edf-bb5d-19ce9e1a693d" containerName="console" Mar 
12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639294 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a46be1f3-50fc-45b5-a480-98d9763db69d" containerName="extract" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.639816 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.642100 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.642228 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.642457 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9rnfq" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.643255 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.643792 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.702359 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-webhook-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.702452 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfj4m\" (UniqueName: \"kubernetes.io/projected/8a847a81-61ae-42e3-9866-c25b68fd77cb-kube-api-access-jfj4m\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.702503 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-apiservice-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.705122 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8"] Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.803889 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-webhook-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.804279 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfj4m\" (UniqueName: 
\"kubernetes.io/projected/8a847a81-61ae-42e3-9866-c25b68fd77cb-kube-api-access-jfj4m\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.804332 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-apiservice-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.814933 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-apiservice-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.824277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8a847a81-61ae-42e3-9866-c25b68fd77cb-webhook-cert\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.828896 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfj4m\" (UniqueName: \"kubernetes.io/projected/8a847a81-61ae-42e3-9866-c25b68fd77cb-kube-api-access-jfj4m\") pod \"metallb-operator-controller-manager-7cf578c5b8-z4gn8\" (UID: \"8a847a81-61ae-42e3-9866-c25b68fd77cb\") " pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.959175 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.967173 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"] Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.967906 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.970160 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.970317 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 12 18:17:44 crc kubenswrapper[4926]: I0312 18:17:44.970548 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gjjw5" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.020991 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mmds\" (UniqueName: \"kubernetes.io/projected/4d6f4326-2022-436b-9523-383aae3fd5cd-kube-api-access-6mmds\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.021086 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-apiservice-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.021149 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-webhook-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.061118 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"] Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.122211 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mmds\" (UniqueName: \"kubernetes.io/projected/4d6f4326-2022-436b-9523-383aae3fd5cd-kube-api-access-6mmds\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.122297 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-apiservice-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.122337 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-webhook-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 
18:17:45.131719 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-webhook-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.137356 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d6f4326-2022-436b-9523-383aae3fd5cd-apiservice-cert\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.162040 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mmds\" (UniqueName: \"kubernetes.io/projected/4d6f4326-2022-436b-9523-383aae3fd5cd-kube-api-access-6mmds\") pod \"metallb-operator-webhook-server-79dbc878dc-w7f58\" (UID: \"4d6f4326-2022-436b-9523-383aae3fd5cd\") " pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.235625 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8"]
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.343711 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.637265 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"]
Mar 12 18:17:45 crc kubenswrapper[4926]: W0312 18:17:45.644299 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6f4326_2022_436b_9523_383aae3fd5cd.slice/crio-61f6d9cd37848b3546c922f5be12b1797c041f0cdc0df201f6726cd850cb1afd WatchSource:0}: Error finding container 61f6d9cd37848b3546c922f5be12b1797c041f0cdc0df201f6726cd850cb1afd: Status 404 returned error can't find the container with id 61f6d9cd37848b3546c922f5be12b1797c041f0cdc0df201f6726cd850cb1afd
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.685593 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" event={"ID":"8a847a81-61ae-42e3-9866-c25b68fd77cb","Type":"ContainerStarted","Data":"9beb4ee77df9449d491ad0358f61dadbb88c3ea4e7c5232740b2ec2dc5613459"}
Mar 12 18:17:45 crc kubenswrapper[4926]: I0312 18:17:45.687088 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" event={"ID":"4d6f4326-2022-436b-9523-383aae3fd5cd","Type":"ContainerStarted","Data":"61f6d9cd37848b3546c922f5be12b1797c041f0cdc0df201f6726cd850cb1afd"}
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.724145 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" event={"ID":"8a847a81-61ae-42e3-9866-c25b68fd77cb","Type":"ContainerStarted","Data":"c44c4510181cfa327e64fc978bdcabc5a11fee48eceb0d971b89a6a7b1e29bcc"}
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.725960 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8"
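Note the ordering in the webhook-server entries above: every volume reports "MountVolume.SetUp succeeded" before the kubelet logs "No sandbox for pod can be found. Need to start a new one" and creates the sandbox. The log is consistent with sandbox creation being gated on all of a pod's volumes being mounted. A toy version of that all-volumes-mounted gate; the data structure and function names are invented for illustration:

```go
// Sketch: proceed to sandbox creation only once every volume is mounted.
package main

import "fmt"

// mounted maps volume name -> whether MountVolume.SetUp has succeeded.
func allVolumesMounted(mounted map[string]bool) bool {
	for _, ok := range mounted {
		if !ok {
			return false
		}
	}
	return true
}

func main() {
	mounted := map[string]bool{
		"webhook-cert":          true,
		"apiservice-cert":       true,
		"kube-api-access-6mmds": true,
	}
	if allVolumesMounted(mounted) {
		fmt.Println("No sandbox for pod can be found. Need to start a new one")
	}
}
```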
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.728227 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" event={"ID":"4d6f4326-2022-436b-9523-383aae3fd5cd","Type":"ContainerStarted","Data":"e591d599c1b024d8b866236eb0fb8a1794e0bfc252c1693353dac27a1e0a2644"}
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.728360 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58"
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.785100 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" podStartSLOduration=1.9878568699999999 podStartE2EDuration="7.785071173s" podCreationTimestamp="2026-03-12 18:17:44 +0000 UTC" firstStartedPulling="2026-03-12 18:17:45.245625298 +0000 UTC m=+905.614251631" lastFinishedPulling="2026-03-12 18:17:51.042839601 +0000 UTC m=+911.411465934" observedRunningTime="2026-03-12 18:17:51.771932534 +0000 UTC m=+912.140558877" watchObservedRunningTime="2026-03-12 18:17:51.785071173 +0000 UTC m=+912.153697516"
Mar 12 18:17:51 crc kubenswrapper[4926]: I0312 18:17:51.821229 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" podStartSLOduration=2.396985769 podStartE2EDuration="7.821212458s" podCreationTimestamp="2026-03-12 18:17:44 +0000 UTC" firstStartedPulling="2026-03-12 18:17:45.647224432 +0000 UTC m=+906.015850755" lastFinishedPulling="2026-03-12 18:17:51.071451111 +0000 UTC m=+911.440077444" observedRunningTime="2026-03-12 18:17:51.820286749 +0000 UTC m=+912.188913122" watchObservedRunningTime="2026-03-12 18:17:51.821212458 +0000 UTC m=+912.189838791"
Mar 12 18:17:56 crc kubenswrapper[4926]: I0312 18:17:56.817235 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:17:56 crc kubenswrapper[4926]: I0312 18:17:56.817774 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:17:56 crc kubenswrapper[4926]: I0312 18:17:56.817818 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:17:56 crc kubenswrapper[4926]: I0312 18:17:56.818325 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 18:17:56 crc kubenswrapper[4926]: I0312 18:17:56.818371 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed" gracePeriod=600
Mar 12 18:17:57 crc kubenswrapper[4926]: I0312 18:17:57.765701 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed" exitCode=0
Mar 12 18:17:57 crc kubenswrapper[4926]: I0312 18:17:57.765788 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed"}
Mar 12 18:17:57 crc kubenswrapper[4926]: I0312 18:17:57.765974 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb"}
Mar 12 18:17:57 crc kubenswrapper[4926]: I0312 18:17:57.765994 4926 scope.go:117] "RemoveContainer" containerID="869151c9e3071e8f72a54f977df4fbec55cdf81d3f75158a024a468d8d420c6b"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.121570 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555658-gfgnc"]
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.122881 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.131905 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555658-gfgnc"]
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.132693 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.134510 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.136779 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.249730 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhr6\" (UniqueName: \"kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6\") pod \"auto-csr-approver-29555658-gfgnc\" (UID: \"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6\") " pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.350724 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhr6\" (UniqueName: \"kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6\") pod \"auto-csr-approver-29555658-gfgnc\" (UID: \"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6\") " pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.379751 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhr6\" (UniqueName: \"kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6\") pod \"auto-csr-approver-29555658-gfgnc\" (UID: \"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6\") " pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
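The machine-config-daemon sequence above runs from failure to restart in one pass: patch_prober records the connection-refused GET against http://127.0.0.1:8798/health, prober marks the liveness probe failed, kuberuntime_manager notes the container "will be restarted", and kuberuntime_container kills it with gracePeriod=600; the next PLEG events show ContainerDied followed by ContainerStarted. A minimal probe sketch using the URL from the log; the status-code check and surrounding control flow are assumptions for illustration, not the kubelet's code:

```go
// Sketch of an HTTP liveness check that fails on connection refused.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. Get "http://127.0.0.1:8798/health": dial tcp 127.0.0.1:8798:
		// connect: connection refused
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: %s", resp.Status)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Probe failed:", err)
		// The kubelet then kills the container with its termination grace
		// period (gracePeriod=600 in the log) so it can be restarted.
	}
}
```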
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.440035 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.657726 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555658-gfgnc"]
Mar 12 18:18:00 crc kubenswrapper[4926]: I0312 18:18:00.786323 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" event={"ID":"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6","Type":"ContainerStarted","Data":"be8d63d1d24bbb247fb5937d6f970c255d35cfad381121636213a2bba32500dc"}
Mar 12 18:18:02 crc kubenswrapper[4926]: I0312 18:18:02.806459 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" event={"ID":"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6","Type":"ContainerStarted","Data":"287cfd2e2e77a9b487765ae984d056f483b704272f97333956b53dc7f101b02d"}
Mar 12 18:18:02 crc kubenswrapper[4926]: I0312 18:18:02.822072 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" podStartSLOduration=1.149782568 podStartE2EDuration="2.822055096s" podCreationTimestamp="2026-03-12 18:18:00 +0000 UTC" firstStartedPulling="2026-03-12 18:18:00.664755629 +0000 UTC m=+921.033381972" lastFinishedPulling="2026-03-12 18:18:02.337028167 +0000 UTC m=+922.705654500" observedRunningTime="2026-03-12 18:18:02.818156726 +0000 UTC m=+923.186783059" watchObservedRunningTime="2026-03-12 18:18:02.822055096 +0000 UTC m=+923.190681429"
Mar 12 18:18:03 crc kubenswrapper[4926]: I0312 18:18:03.815334 4926 generic.go:334] "Generic (PLEG): container finished" podID="4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" containerID="287cfd2e2e77a9b487765ae984d056f483b704272f97333956b53dc7f101b02d" exitCode=0
Mar 12 18:18:03 crc kubenswrapper[4926]: I0312 18:18:03.815401 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" event={"ID":"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6","Type":"ContainerDied","Data":"287cfd2e2e77a9b487765ae984d056f483b704272f97333956b53dc7f101b02d"}
Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.133021 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555658-gfgnc"
Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.214847 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhr6\" (UniqueName: \"kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6\") pod \"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6\" (UID: \"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6\") "
Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.225668 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6" (OuterVolumeSpecName: "kube-api-access-hnhr6") pod "4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" (UID: "4b727d6f-6c9d-44ad-8594-accc9d2c4ed6"). InnerVolumeSpecName "kube-api-access-hnhr6".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.316431 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhr6\" (UniqueName: \"kubernetes.io/projected/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6-kube-api-access-hnhr6\") on node \"crc\" DevicePath \"\"" Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.348406 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79dbc878dc-w7f58" Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.830340 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" event={"ID":"4b727d6f-6c9d-44ad-8594-accc9d2c4ed6","Type":"ContainerDied","Data":"be8d63d1d24bbb247fb5937d6f970c255d35cfad381121636213a2bba32500dc"} Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.830383 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8d63d1d24bbb247fb5937d6f970c255d35cfad381121636213a2bba32500dc" Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.830461 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555658-gfgnc" Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.883625 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555652-mrcx4"] Mar 12 18:18:05 crc kubenswrapper[4926]: I0312 18:18:05.890901 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555652-mrcx4"] Mar 12 18:18:06 crc kubenswrapper[4926]: I0312 18:18:06.497970 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a135018f-2c21-4678-a0fc-9d6b62dda2d6" path="/var/lib/kubelet/pods/a135018f-2c21-4678-a0fc-9d6b62dda2d6/volumes" Mar 12 18:18:24 crc kubenswrapper[4926]: I0312 18:18:24.962673 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7cf578c5b8-z4gn8" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.694485 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d"] Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.694857 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" containerName="oc" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.694882 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" containerName="oc" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.695058 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" containerName="oc" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.695687 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.699029 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.699101 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2thch" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.701484 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9p72p"] Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.704350 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.706018 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.706252 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.710259 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d"] Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.774156 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6bt9q"] Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.775182 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.777510 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.777716 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.777830 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.777943 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t48cm" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794663 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-reloader\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794724 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-sockets\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794752 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjp96\" (UniqueName: \"kubernetes.io/projected/036c2795-2942-4cc8-9a91-6cc48cbe7521-kube-api-access-zjp96\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794777 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-startup\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794799 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-conf\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794821 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794839 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794911 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhdm\" (UniqueName: \"kubernetes.io/projected/53da3fff-e3f4-4b9d-a887-f5a28f986107-kube-api-access-zzhdm\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.794933 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.797362 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-58dtw"] Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.798735 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.808072 4926 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.826828 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-58dtw"] Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.895990 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-sockets\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896041 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjp96\" (UniqueName: \"kubernetes.io/projected/036c2795-2942-4cc8-9a91-6cc48cbe7521-kube-api-access-zjp96\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896060 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-startup\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896078 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-conf\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896097 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896113 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896134 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-cert\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896163 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896181 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxzrt\" (UniqueName: \"kubernetes.io/projected/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-kube-api-access-lxzrt\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896209 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-metrics-certs\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896286 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cq2\" (UniqueName: \"kubernetes.io/projected/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-kube-api-access-84cq2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896305 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhdm\" (UniqueName: \"kubernetes.io/projected/53da3fff-e3f4-4b9d-a887-f5a28f986107-kube-api-access-zzhdm\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896320 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896347 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metallb-excludel2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896370 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metrics-certs\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.896380 4926 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.896484 4926 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.896504 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert podName:036c2795-2942-4cc8-9a91-6cc48cbe7521 nodeName:}" failed. No retries permitted until 2026-03-12 18:18:26.396449129 +0000 UTC m=+946.765075472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert") pod "frr-k8s-webhook-server-bcc4b6f68-j2n6d" (UID: "036c2795-2942-4cc8-9a91-6cc48cbe7521") : secret "frr-k8s-webhook-server-cert" not found Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.896535 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs podName:53da3fff-e3f4-4b9d-a887-f5a28f986107 nodeName:}" failed. No retries permitted until 2026-03-12 18:18:26.396514781 +0000 UTC m=+946.765141114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs") pod "frr-k8s-9p72p" (UID: "53da3fff-e3f4-4b9d-a887-f5a28f986107") : secret "frr-k8s-certs-secret" not found Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896492 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-sockets\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896551 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-reloader\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896621 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-conf\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896697 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.896826 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/53da3fff-e3f4-4b9d-a887-f5a28f986107-reloader\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.897131 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/53da3fff-e3f4-4b9d-a887-f5a28f986107-frr-startup\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.917337 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhdm\" (UniqueName: \"kubernetes.io/projected/53da3fff-e3f4-4b9d-a887-f5a28f986107-kube-api-access-zzhdm\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.918094 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjp96\" (UniqueName: 
\"kubernetes.io/projected/036c2795-2942-4cc8-9a91-6cc48cbe7521-kube-api-access-zjp96\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998342 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84cq2\" (UniqueName: \"kubernetes.io/projected/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-kube-api-access-84cq2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998415 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metallb-excludel2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998462 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metrics-certs\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998531 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-cert\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998559 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998580 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxzrt\" (UniqueName: \"kubernetes.io/projected/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-kube-api-access-lxzrt\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.998613 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-metrics-certs\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.998819 4926 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 18:18:25 crc kubenswrapper[4926]: E0312 18:18:25.998910 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist podName:d10f0ca7-6fc4-4e6a-815c-ad5a1db16350 nodeName:}" failed. No retries permitted until 2026-03-12 18:18:26.498888876 +0000 UTC m=+946.867515219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist") pod "speaker-6bt9q" (UID: "d10f0ca7-6fc4-4e6a-815c-ad5a1db16350") : secret "metallb-memberlist" not found Mar 12 18:18:25 crc kubenswrapper[4926]: I0312 18:18:25.999279 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metallb-excludel2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.001878 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-metrics-certs\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.002138 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-metrics-certs\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.002318 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-cert\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.016199 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cq2\" (UniqueName: \"kubernetes.io/projected/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-kube-api-access-84cq2\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.026959 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxzrt\" (UniqueName: \"kubernetes.io/projected/16ea4f33-38fb-42c7-9c85-67c443f0b3a4-kube-api-access-lxzrt\") pod \"controller-7bb4cc7c98-58dtw\" (UID: \"16ea4f33-38fb-42c7-9c85-67c443f0b3a4\") " pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.128177 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.403399 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.403811 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.409003 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53da3fff-e3f4-4b9d-a887-f5a28f986107-metrics-certs\") pod \"frr-k8s-9p72p\" (UID: \"53da3fff-e3f4-4b9d-a887-f5a28f986107\") " pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.409576 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/036c2795-2942-4cc8-9a91-6cc48cbe7521-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-j2n6d\" (UID: \"036c2795-2942-4cc8-9a91-6cc48cbe7521\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.505535 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:26 crc kubenswrapper[4926]: E0312 18:18:26.505684 4926 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 18:18:26 crc kubenswrapper[4926]: E0312 18:18:26.505737 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist podName:d10f0ca7-6fc4-4e6a-815c-ad5a1db16350 nodeName:}" failed. No retries permitted until 2026-03-12 18:18:27.505720805 +0000 UTC m=+947.874347138 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist") pod "speaker-6bt9q" (UID: "d10f0ca7-6fc4-4e6a-815c-ad5a1db16350") : secret "metallb-memberlist" not found Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.537710 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-58dtw"] Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.622724 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.638880 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.861401 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d"] Mar 12 18:18:26 crc kubenswrapper[4926]: W0312 18:18:26.864909 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod036c2795_2942_4cc8_9a91_6cc48cbe7521.slice/crio-af71a557196e989147aaf222c4407e462d264b77648a93a33cb0b35b3d989008 WatchSource:0}: Error finding container af71a557196e989147aaf222c4407e462d264b77648a93a33cb0b35b3d989008: Status 404 returned error can't find the container with id af71a557196e989147aaf222c4407e462d264b77648a93a33cb0b35b3d989008 Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.975012 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"7bc9d3477ec1337148248500fa8f077509e6af660b9e36f31ca9c796ca21db3c"} Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.976154 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" event={"ID":"036c2795-2942-4cc8-9a91-6cc48cbe7521","Type":"ContainerStarted","Data":"af71a557196e989147aaf222c4407e462d264b77648a93a33cb0b35b3d989008"} Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.978325 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-58dtw" event={"ID":"16ea4f33-38fb-42c7-9c85-67c443f0b3a4","Type":"ContainerStarted","Data":"03c9a6572f409b14b5d79d225c8d698d54ce5f2ad0d5248d916c606a8887b589"} Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.978369 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-58dtw" event={"ID":"16ea4f33-38fb-42c7-9c85-67c443f0b3a4","Type":"ContainerStarted","Data":"a85373b45dbbe732c0eba8edc211072a173bb3fa500977a7dc241c0dd6a3a6ca"} Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.978380 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-58dtw" event={"ID":"16ea4f33-38fb-42c7-9c85-67c443f0b3a4","Type":"ContainerStarted","Data":"51a47c9019e7bdcbf4b49d7c372cbd35b4692e9ef8de14fcd80d7f16b9e28168"} Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.978471 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:26 crc kubenswrapper[4926]: I0312 18:18:26.994816 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-58dtw" podStartSLOduration=1.994801211 podStartE2EDuration="1.994801211s" podCreationTimestamp="2026-03-12 18:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:18:26.99059688 +0000 UTC m=+947.359223213" watchObservedRunningTime="2026-03-12 18:18:26.994801211 +0000 UTC m=+947.363427544" Mar 12 18:18:27 crc kubenswrapper[4926]: I0312 18:18:27.522684 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:27 crc kubenswrapper[4926]: I0312 18:18:27.529625 4926 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d10f0ca7-6fc4-4e6a-815c-ad5a1db16350-memberlist\") pod \"speaker-6bt9q\" (UID: \"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350\") " pod="metallb-system/speaker-6bt9q" Mar 12 18:18:27 crc kubenswrapper[4926]: I0312 18:18:27.591010 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6bt9q" Mar 12 18:18:27 crc kubenswrapper[4926]: I0312 18:18:27.986728 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6bt9q" event={"ID":"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350","Type":"ContainerStarted","Data":"479de8f9c60ff091edbc68c3a1ec82bf8519f9a0c67baefdb7b6e478740c5685"} Mar 12 18:18:27 crc kubenswrapper[4926]: I0312 18:18:27.986797 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6bt9q" event={"ID":"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350","Type":"ContainerStarted","Data":"cffa8917727a15408f12c2bf94d6fd73ecf0e805a71196c59b24ca6f44afb9b8"} Mar 12 18:18:28 crc kubenswrapper[4926]: I0312 18:18:28.994348 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6bt9q" event={"ID":"d10f0ca7-6fc4-4e6a-815c-ad5a1db16350","Type":"ContainerStarted","Data":"76ce3956f3b260add7de62d4cd4e7373ef3bf207020bbea3a061106566998243"} Mar 12 18:18:28 crc kubenswrapper[4926]: I0312 18:18:28.994619 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6bt9q" Mar 12 18:18:29 crc kubenswrapper[4926]: I0312 18:18:29.020791 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6bt9q" podStartSLOduration=4.020768243 podStartE2EDuration="4.020768243s" podCreationTimestamp="2026-03-12 18:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:18:29.016134919 +0000 UTC m=+949.384761252" watchObservedRunningTime="2026-03-12 18:18:29.020768243 +0000 UTC m=+949.389394576" Mar 12 18:18:36 crc kubenswrapper[4926]: I0312 18:18:36.133160 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-58dtw" Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.090672 4926 generic.go:334] "Generic (PLEG): container finished" podID="53da3fff-e3f4-4b9d-a887-f5a28f986107" containerID="764bc52e4da56ad2ecb50ab328be575e043ec37d17dee06fbb12ab36d7af44c8" exitCode=0 Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.090751 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerDied","Data":"764bc52e4da56ad2ecb50ab328be575e043ec37d17dee06fbb12ab36d7af44c8"} Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.092510 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" event={"ID":"036c2795-2942-4cc8-9a91-6cc48cbe7521","Type":"ContainerStarted","Data":"c699b6568bed00fc040b1d65e6af418f4fdbc86390fce722ce49b05c81835a8b"} Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.092713 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.154999 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" 
podStartSLOduration=3.092244015 podStartE2EDuration="12.154981505s" podCreationTimestamp="2026-03-12 18:18:25 +0000 UTC" firstStartedPulling="2026-03-12 18:18:26.867938954 +0000 UTC m=+947.236565287" lastFinishedPulling="2026-03-12 18:18:35.930676434 +0000 UTC m=+956.299302777" observedRunningTime="2026-03-12 18:18:37.153823379 +0000 UTC m=+957.522449712" watchObservedRunningTime="2026-03-12 18:18:37.154981505 +0000 UTC m=+957.523607838" Mar 12 18:18:37 crc kubenswrapper[4926]: I0312 18:18:37.596012 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6bt9q" Mar 12 18:18:38 crc kubenswrapper[4926]: I0312 18:18:38.103804 4926 generic.go:334] "Generic (PLEG): container finished" podID="53da3fff-e3f4-4b9d-a887-f5a28f986107" containerID="060ed7cddeb7f6b2a72d027d9728b1971913869840a312f2fc04fea89d0d6142" exitCode=0 Mar 12 18:18:38 crc kubenswrapper[4926]: I0312 18:18:38.103906 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerDied","Data":"060ed7cddeb7f6b2a72d027d9728b1971913869840a312f2fc04fea89d0d6142"} Mar 12 18:18:39 crc kubenswrapper[4926]: I0312 18:18:39.111942 4926 generic.go:334] "Generic (PLEG): container finished" podID="53da3fff-e3f4-4b9d-a887-f5a28f986107" containerID="b36c63abfe5b08cdc3a148b062643ebd6e582fffc5989f4a952be3fd61137963" exitCode=0 Mar 12 18:18:39 crc kubenswrapper[4926]: I0312 18:18:39.111983 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerDied","Data":"b36c63abfe5b08cdc3a148b062643ebd6e582fffc5989f4a952be3fd61137963"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.122731 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"bcfb93ea45aae8696c20fa8649694239781a5bd2ecd56d2e8e91e41a85b4384d"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.123056 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"1d8f0804408b03049e96fc9415277c1c2a291f83e8b7cd21c0fe8809bc68868e"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.123069 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"204cb34230ede5855ffb23a7568e4a8fe97836d9b043d4c2ce1417955a626e5e"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.123078 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"42dd16964c7e152638199f8c806180078e0b4b5b008d51c7b01231e9f537b3c5"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.123087 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"484ec29a538864455de1aa755032e7b37e46e22e97117bd712533afa9a92e162"} Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.265120 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.265982 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.269294 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sqgdz" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.269660 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.270278 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.278790 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.411112 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5nb\" (UniqueName: \"kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb\") pod \"openstack-operator-index-vj2fp\" (UID: \"c7718f63-5b8f-4137-8a80-f8793ce3e60f\") " pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.512574 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js5nb\" (UniqueName: \"kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb\") pod \"openstack-operator-index-vj2fp\" (UID: \"c7718f63-5b8f-4137-8a80-f8793ce3e60f\") " pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.525471 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.535844 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.561815 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5nb\" (UniqueName: \"kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb\") pod \"openstack-operator-index-vj2fp\" (UID: \"c7718f63-5b8f-4137-8a80-f8793ce3e60f\") " pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.640325 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sqgdz" Mar 12 18:18:40 crc kubenswrapper[4926]: I0312 18:18:40.649685 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.058162 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:41 crc kubenswrapper[4926]: W0312 18:18:41.066534 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7718f63_5b8f_4137_8a80_f8793ce3e60f.slice/crio-d25560d3728b5ebac261cd626d37b593240ae569a7e8134014f5988620070bba WatchSource:0}: Error finding container d25560d3728b5ebac261cd626d37b593240ae569a7e8134014f5988620070bba: Status 404 returned error can't find the container with id d25560d3728b5ebac261cd626d37b593240ae569a7e8134014f5988620070bba Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.130290 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vj2fp" event={"ID":"c7718f63-5b8f-4137-8a80-f8793ce3e60f","Type":"ContainerStarted","Data":"d25560d3728b5ebac261cd626d37b593240ae569a7e8134014f5988620070bba"} Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.134301 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9p72p" event={"ID":"53da3fff-e3f4-4b9d-a887-f5a28f986107","Type":"ContainerStarted","Data":"0df53e49a20875db19a35b9d447319dca5a8e19fed241dcaf1ecbff4e2511284"} Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.134607 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.159540 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9p72p" podStartSLOduration=7.009592831 podStartE2EDuration="16.159526365s" podCreationTimestamp="2026-03-12 18:18:25 +0000 UTC" firstStartedPulling="2026-03-12 18:18:26.780952477 +0000 UTC m=+947.149578810" lastFinishedPulling="2026-03-12 18:18:35.930886001 +0000 UTC m=+956.299512344" observedRunningTime="2026-03-12 18:18:41.15776978 +0000 UTC m=+961.526396143" watchObservedRunningTime="2026-03-12 18:18:41.159526365 +0000 UTC m=+961.528152698" Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.640511 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:41 crc kubenswrapper[4926]: I0312 18:18:41.689212 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:43 crc kubenswrapper[4926]: I0312 18:18:43.667602 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.154993 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vj2fp" event={"ID":"c7718f63-5b8f-4137-8a80-f8793ce3e60f","Type":"ContainerStarted","Data":"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3"} Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.155484 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vj2fp" podUID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" containerName="registry-server" containerID="cri-o://78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3" gracePeriod=2 Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.185291 4926 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/openstack-operator-index-vj2fp" podStartSLOduration=1.7441221439999999 podStartE2EDuration="4.185269143s" podCreationTimestamp="2026-03-12 18:18:40 +0000 UTC" firstStartedPulling="2026-03-12 18:18:41.069189505 +0000 UTC m=+961.437815838" lastFinishedPulling="2026-03-12 18:18:43.510336504 +0000 UTC m=+963.878962837" observedRunningTime="2026-03-12 18:18:44.179869314 +0000 UTC m=+964.548495647" watchObservedRunningTime="2026-03-12 18:18:44.185269143 +0000 UTC m=+964.553895506" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.268952 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hwc2c"] Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.270520 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.278763 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hwc2c"] Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.370964 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2h8p\" (UniqueName: \"kubernetes.io/projected/66ae05ba-2fc2-4915-9899-083a49295427-kube-api-access-p2h8p\") pod \"openstack-operator-index-hwc2c\" (UID: \"66ae05ba-2fc2-4915-9899-083a49295427\") " pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.473531 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2h8p\" (UniqueName: \"kubernetes.io/projected/66ae05ba-2fc2-4915-9899-083a49295427-kube-api-access-p2h8p\") pod \"openstack-operator-index-hwc2c\" (UID: \"66ae05ba-2fc2-4915-9899-083a49295427\") " pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.495746 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2h8p\" (UniqueName: \"kubernetes.io/projected/66ae05ba-2fc2-4915-9899-083a49295427-kube-api-access-p2h8p\") pod \"openstack-operator-index-hwc2c\" (UID: \"66ae05ba-2fc2-4915-9899-083a49295427\") " pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.521750 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.574408 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js5nb\" (UniqueName: \"kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb\") pod \"c7718f63-5b8f-4137-8a80-f8793ce3e60f\" (UID: \"c7718f63-5b8f-4137-8a80-f8793ce3e60f\") " Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.578208 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb" (OuterVolumeSpecName: "kube-api-access-js5nb") pod "c7718f63-5b8f-4137-8a80-f8793ce3e60f" (UID: "c7718f63-5b8f-4137-8a80-f8793ce3e60f"). InnerVolumeSpecName "kube-api-access-js5nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.621596 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.675711 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js5nb\" (UniqueName: \"kubernetes.io/projected/c7718f63-5b8f-4137-8a80-f8793ce3e60f-kube-api-access-js5nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:18:44 crc kubenswrapper[4926]: I0312 18:18:44.851791 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hwc2c"] Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.166155 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hwc2c" event={"ID":"66ae05ba-2fc2-4915-9899-083a49295427","Type":"ContainerStarted","Data":"6fc5bf8926e75ff0040ebbba4ef2d374a3641b9d4ffaaa604bc860ce9ed9fd30"} Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.166809 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hwc2c" event={"ID":"66ae05ba-2fc2-4915-9899-083a49295427","Type":"ContainerStarted","Data":"6cccf55ee6fc80f4119c9218e0e40321791ca8a589afdaf0b6a46644e7580f06"} Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.169965 4926 generic.go:334] "Generic (PLEG): container finished" podID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" containerID="78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3" exitCode=0 Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.169994 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vj2fp" Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.170068 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vj2fp" event={"ID":"c7718f63-5b8f-4137-8a80-f8793ce3e60f","Type":"ContainerDied","Data":"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3"} Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.170171 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vj2fp" event={"ID":"c7718f63-5b8f-4137-8a80-f8793ce3e60f","Type":"ContainerDied","Data":"d25560d3728b5ebac261cd626d37b593240ae569a7e8134014f5988620070bba"} Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.170208 4926 scope.go:117] "RemoveContainer" containerID="78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3" Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.192111 4926 scope.go:117] "RemoveContainer" containerID="78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3" Mar 12 18:18:45 crc kubenswrapper[4926]: E0312 18:18:45.193058 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3\": container with ID starting with 78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3 not found: ID does not exist" containerID="78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3" Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.193153 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3"} err="failed to get container status \"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3\": rpc error: code = NotFound desc = could not find container 
\"78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3\": container with ID starting with 78cc23ad26fef8004b936ce6e2b86f5bce9f33915b9e8b8f133965ac739d7ff3 not found: ID does not exist" Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.194294 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hwc2c" podStartSLOduration=1.146188808 podStartE2EDuration="1.194270854s" podCreationTimestamp="2026-03-12 18:18:44 +0000 UTC" firstStartedPulling="2026-03-12 18:18:44.877060475 +0000 UTC m=+965.245686808" lastFinishedPulling="2026-03-12 18:18:44.925142521 +0000 UTC m=+965.293768854" observedRunningTime="2026-03-12 18:18:45.18736935 +0000 UTC m=+965.555995723" watchObservedRunningTime="2026-03-12 18:18:45.194270854 +0000 UTC m=+965.562897227" Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.214758 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:45 crc kubenswrapper[4926]: I0312 18:18:45.223703 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vj2fp"] Mar 12 18:18:46 crc kubenswrapper[4926]: I0312 18:18:46.501250 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" path="/var/lib/kubelet/pods/c7718f63-5b8f-4137-8a80-f8793ce3e60f/volumes" Mar 12 18:18:46 crc kubenswrapper[4926]: I0312 18:18:46.629224 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-j2n6d" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.874877 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:18:51 crc kubenswrapper[4926]: E0312 18:18:51.875563 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" containerName="registry-server" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.875584 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" containerName="registry-server" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.875829 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7718f63-5b8f-4137-8a80-f8793ce3e60f" containerName="registry-server" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.877177 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.898023 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.994186 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.994254 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6r7m\" (UniqueName: \"kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:51 crc kubenswrapper[4926]: I0312 18:18:51.994287 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.095061 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.095196 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.095269 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6r7m\" (UniqueName: \"kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.095827 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.095849 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.132613 4926 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h6r7m\" (UniqueName: \"kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m\") pod \"community-operators-tn2nt\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:52 crc kubenswrapper[4926]: I0312 18:18:52.226851 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:18:53 crc kubenswrapper[4926]: I0312 18:18:53.192197 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:18:53 crc kubenswrapper[4926]: W0312 18:18:53.200690 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d1b5b0_e990_4051_884a_8b1bfb331618.slice/crio-6f4a9a1bcdab9a6ab28add7cca0c853a121afbf4dfa55c723b2f982597c64d37 WatchSource:0}: Error finding container 6f4a9a1bcdab9a6ab28add7cca0c853a121afbf4dfa55c723b2f982597c64d37: Status 404 returned error can't find the container with id 6f4a9a1bcdab9a6ab28add7cca0c853a121afbf4dfa55c723b2f982597c64d37 Mar 12 18:18:53 crc kubenswrapper[4926]: I0312 18:18:53.249125 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerStarted","Data":"6f4a9a1bcdab9a6ab28add7cca0c853a121afbf4dfa55c723b2f982597c64d37"} Mar 12 18:18:54 crc kubenswrapper[4926]: I0312 18:18:54.260897 4926 generic.go:334] "Generic (PLEG): container finished" podID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerID="7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88" exitCode=0 Mar 12 18:18:54 crc kubenswrapper[4926]: I0312 18:18:54.262152 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerDied","Data":"7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88"} Mar 12 18:18:54 crc kubenswrapper[4926]: I0312 18:18:54.622607 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:54 crc kubenswrapper[4926]: I0312 18:18:54.622666 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:54 crc kubenswrapper[4926]: I0312 18:18:54.653657 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:55 crc kubenswrapper[4926]: I0312 18:18:55.315995 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hwc2c" Mar 12 18:18:56 crc kubenswrapper[4926]: I0312 18:18:56.287811 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerStarted","Data":"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3"} Mar 12 18:18:56 crc kubenswrapper[4926]: I0312 18:18:56.642217 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9p72p" Mar 12 18:18:57 crc kubenswrapper[4926]: I0312 18:18:57.296170 4926 generic.go:334] "Generic (PLEG): container finished" podID="c3d1b5b0-e990-4051-884a-8b1bfb331618" 
containerID="ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3" exitCode=0 Mar 12 18:18:57 crc kubenswrapper[4926]: I0312 18:18:57.296226 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerDied","Data":"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3"} Mar 12 18:18:58 crc kubenswrapper[4926]: I0312 18:18:58.306900 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerStarted","Data":"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911"} Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.237102 4926 scope.go:117] "RemoveContainer" containerID="80892424cda12377cff2432a54ca031087805d38670f04b3a339fc6f8678512c" Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.936675 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tn2nt" podStartSLOduration=7.484299345 podStartE2EDuration="10.936657235s" podCreationTimestamp="2026-03-12 18:18:51 +0000 UTC" firstStartedPulling="2026-03-12 18:18:54.262818756 +0000 UTC m=+974.631445099" lastFinishedPulling="2026-03-12 18:18:57.715176646 +0000 UTC m=+978.083802989" observedRunningTime="2026-03-12 18:18:58.335065411 +0000 UTC m=+978.703691754" watchObservedRunningTime="2026-03-12 18:19:01.936657235 +0000 UTC m=+982.305283578" Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.939425 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt"] Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.940832 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.944271 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xfbfb" Mar 12 18:19:01 crc kubenswrapper[4926]: I0312 18:19:01.964719 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt"] Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.093779 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.093834 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27czd\" (UniqueName: \"kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.094288 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.195090 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.195351 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.195398 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27czd\" (UniqueName: \"kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.196052 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.196055 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.218279 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27czd\" (UniqueName: \"kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd\") pod \"0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.227397 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.227467 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.263858 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.280737 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.369997 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:02 crc kubenswrapper[4926]: I0312 18:19:02.693210 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt"] Mar 12 18:19:02 crc kubenswrapper[4926]: W0312 18:19:02.700664 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a47463c_c539_4a5c_a3fd_c09feef2de67.slice/crio-0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3 WatchSource:0}: Error finding container 0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3: Status 404 returned error can't find the container with id 0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3 Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.273248 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.276541 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.295326 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.324917 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.324998 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.325091 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q26vh\" (UniqueName: \"kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.339772 4926 generic.go:334] "Generic (PLEG): container finished" podID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerID="7db1301999f83d9d375224675ed3da76f79f97b39dcee8b522fb3ee989cf9aa9" exitCode=0 Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.341218 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" event={"ID":"1a47463c-c539-4a5c-a3fd-c09feef2de67","Type":"ContainerDied","Data":"7db1301999f83d9d375224675ed3da76f79f97b39dcee8b522fb3ee989cf9aa9"} Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.341264 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" event={"ID":"1a47463c-c539-4a5c-a3fd-c09feef2de67","Type":"ContainerStarted","Data":"0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3"} Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.426057 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.426165 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q26vh\" (UniqueName: \"kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.426232 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities\") pod 
\"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.426834 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.427277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.455929 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q26vh\" (UniqueName: \"kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh\") pod \"certified-operators-v54tn\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.619534 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:03 crc kubenswrapper[4926]: I0312 18:19:03.842446 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:04 crc kubenswrapper[4926]: I0312 18:19:04.350656 4926 generic.go:334] "Generic (PLEG): container finished" podID="b1bc86da-e204-472f-9744-af11c0e8a265" containerID="88e86e609584800aecd00ec27240db80609a2d80782a4ff3069326cd7151ac45" exitCode=0 Mar 12 18:19:04 crc kubenswrapper[4926]: I0312 18:19:04.350735 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerDied","Data":"88e86e609584800aecd00ec27240db80609a2d80782a4ff3069326cd7151ac45"} Mar 12 18:19:04 crc kubenswrapper[4926]: I0312 18:19:04.350947 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerStarted","Data":"26d978bbd24802db4c024d84e6fb86f4ea3699958fdd595fabcd49191a4e9809"} Mar 12 18:19:04 crc kubenswrapper[4926]: I0312 18:19:04.352667 4926 generic.go:334] "Generic (PLEG): container finished" podID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerID="ab2124d1cf4401d3e3c56ad6c8da750643ef586b4af6bfe3671baf43b473f054" exitCode=0 Mar 12 18:19:04 crc kubenswrapper[4926]: I0312 18:19:04.352687 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" event={"ID":"1a47463c-c539-4a5c-a3fd-c09feef2de67","Type":"ContainerDied","Data":"ab2124d1cf4401d3e3c56ad6c8da750643ef586b4af6bfe3671baf43b473f054"} Mar 12 18:19:05 crc kubenswrapper[4926]: I0312 18:19:05.359969 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerStarted","Data":"8f6bfe79045806d74f84405e9ff1f3a84c03533efa8e6773bb09b52e303ff4e2"} Mar 12 18:19:05 crc kubenswrapper[4926]: I0312 
18:19:05.363252 4926 generic.go:334] "Generic (PLEG): container finished" podID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerID="8da1faac0601d97457d68329ffd4b3a2e386575abb53cf5c7eb97ea42d6b0cd8" exitCode=0 Mar 12 18:19:05 crc kubenswrapper[4926]: I0312 18:19:05.363290 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" event={"ID":"1a47463c-c539-4a5c-a3fd-c09feef2de67","Type":"ContainerDied","Data":"8da1faac0601d97457d68329ffd4b3a2e386575abb53cf5c7eb97ea42d6b0cd8"} Mar 12 18:19:06 crc kubenswrapper[4926]: I0312 18:19:06.375501 4926 generic.go:334] "Generic (PLEG): container finished" podID="b1bc86da-e204-472f-9744-af11c0e8a265" containerID="8f6bfe79045806d74f84405e9ff1f3a84c03533efa8e6773bb09b52e303ff4e2" exitCode=0 Mar 12 18:19:06 crc kubenswrapper[4926]: I0312 18:19:06.375576 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerDied","Data":"8f6bfe79045806d74f84405e9ff1f3a84c03533efa8e6773bb09b52e303ff4e2"} Mar 12 18:19:06 crc kubenswrapper[4926]: I0312 18:19:06.848939 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:19:06 crc kubenswrapper[4926]: I0312 18:19:06.849212 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tn2nt" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="registry-server" containerID="cri-o://3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911" gracePeriod=2 Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.015723 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.131722 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle\") pod \"1a47463c-c539-4a5c-a3fd-c09feef2de67\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.131818 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27czd\" (UniqueName: \"kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd\") pod \"1a47463c-c539-4a5c-a3fd-c09feef2de67\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.131888 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util\") pod \"1a47463c-c539-4a5c-a3fd-c09feef2de67\" (UID: \"1a47463c-c539-4a5c-a3fd-c09feef2de67\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.132557 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle" (OuterVolumeSpecName: "bundle") pod "1a47463c-c539-4a5c-a3fd-c09feef2de67" (UID: "1a47463c-c539-4a5c-a3fd-c09feef2de67"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.137668 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd" (OuterVolumeSpecName: "kube-api-access-27czd") pod "1a47463c-c539-4a5c-a3fd-c09feef2de67" (UID: "1a47463c-c539-4a5c-a3fd-c09feef2de67"). InnerVolumeSpecName "kube-api-access-27czd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.155289 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util" (OuterVolumeSpecName: "util") pod "1a47463c-c539-4a5c-a3fd-c09feef2de67" (UID: "1a47463c-c539-4a5c-a3fd-c09feef2de67"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.232771 4926 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-util\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.232801 4926 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a47463c-c539-4a5c-a3fd-c09feef2de67-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.232811 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27czd\" (UniqueName: \"kubernetes.io/projected/1a47463c-c539-4a5c-a3fd-c09feef2de67-kube-api-access-27czd\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.234356 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.333920 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content\") pod \"c3d1b5b0-e990-4051-884a-8b1bfb331618\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.333998 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6r7m\" (UniqueName: \"kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m\") pod \"c3d1b5b0-e990-4051-884a-8b1bfb331618\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.334041 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities\") pod \"c3d1b5b0-e990-4051-884a-8b1bfb331618\" (UID: \"c3d1b5b0-e990-4051-884a-8b1bfb331618\") " Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.334892 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities" (OuterVolumeSpecName: "utilities") pod "c3d1b5b0-e990-4051-884a-8b1bfb331618" (UID: "c3d1b5b0-e990-4051-884a-8b1bfb331618"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.335062 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.337141 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m" (OuterVolumeSpecName: "kube-api-access-h6r7m") pod "c3d1b5b0-e990-4051-884a-8b1bfb331618" (UID: "c3d1b5b0-e990-4051-884a-8b1bfb331618"). InnerVolumeSpecName "kube-api-access-h6r7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.388640 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerStarted","Data":"2a931491103b2d031737572ba24dfcf764dd2ec0c44c2ffec92cc6d03c2bd8ba"} Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.391027 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" event={"ID":"1a47463c-c539-4a5c-a3fd-c09feef2de67","Type":"ContainerDied","Data":"0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3"} Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.391059 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a67f29c83a4a856b4c6dea6d1a3b835900776ea48c9acafa40db5354f48f0b3" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.391108 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.395381 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3d1b5b0-e990-4051-884a-8b1bfb331618" (UID: "c3d1b5b0-e990-4051-884a-8b1bfb331618"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.397049 4926 generic.go:334] "Generic (PLEG): container finished" podID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerID="3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911" exitCode=0 Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.397079 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerDied","Data":"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911"} Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.397099 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tn2nt" event={"ID":"c3d1b5b0-e990-4051-884a-8b1bfb331618","Type":"ContainerDied","Data":"6f4a9a1bcdab9a6ab28add7cca0c853a121afbf4dfa55c723b2f982597c64d37"} Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.397115 4926 scope.go:117] "RemoveContainer" containerID="3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.397213 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tn2nt" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.428130 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v54tn" podStartSLOduration=1.8928752690000001 podStartE2EDuration="4.428113045s" podCreationTimestamp="2026-03-12 18:19:03 +0000 UTC" firstStartedPulling="2026-03-12 18:19:04.352776585 +0000 UTC m=+984.721402918" lastFinishedPulling="2026-03-12 18:19:06.888014321 +0000 UTC m=+987.256640694" observedRunningTime="2026-03-12 18:19:07.424239605 +0000 UTC m=+987.792865938" watchObservedRunningTime="2026-03-12 18:19:07.428113045 +0000 UTC m=+987.796739378" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.431812 4926 scope.go:117] "RemoveContainer" containerID="ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.444845 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6r7m\" (UniqueName: \"kubernetes.io/projected/c3d1b5b0-e990-4051-884a-8b1bfb331618-kube-api-access-h6r7m\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.444983 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3d1b5b0-e990-4051-884a-8b1bfb331618-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.447471 4926 scope.go:117] "RemoveContainer" containerID="7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.464951 4926 scope.go:117] "RemoveContainer" containerID="3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911" Mar 12 18:19:07 crc kubenswrapper[4926]: E0312 18:19:07.465508 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911\": container with ID starting with 3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911 not found: ID does not exist" containerID="3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.465537 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911"} err="failed to get container status \"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911\": rpc error: code = NotFound desc = could not find container \"3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911\": container with ID starting with 3c291b723d1fcde95eb414f900dad95e6d41bc5c05e6a755da9ff76633773911 not found: ID does not exist" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.465560 4926 scope.go:117] "RemoveContainer" containerID="ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.465622 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:19:07 crc kubenswrapper[4926]: E0312 18:19:07.465763 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3\": container with ID starting with 
ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3 not found: ID does not exist" containerID="ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.465792 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3"} err="failed to get container status \"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3\": rpc error: code = NotFound desc = could not find container \"ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3\": container with ID starting with ab4205b30f9b012dcfa5b4992c8916cbd40cecadcc8d510b5930cd515376aff3 not found: ID does not exist" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.465828 4926 scope.go:117] "RemoveContainer" containerID="7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88" Mar 12 18:19:07 crc kubenswrapper[4926]: E0312 18:19:07.466627 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88\": container with ID starting with 7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88 not found: ID does not exist" containerID="7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.466666 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88"} err="failed to get container status \"7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88\": rpc error: code = NotFound desc = could not find container \"7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88\": container with ID starting with 7379ac14c87898826abda5e79ef7d089396a096b81ae86a0cd8ad5065ea31f88 not found: ID does not exist" Mar 12 18:19:07 crc kubenswrapper[4926]: I0312 18:19:07.471483 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tn2nt"] Mar 12 18:19:08 crc kubenswrapper[4926]: I0312 18:19:08.500474 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" path="/var/lib/kubelet/pods/c3d1b5b0-e990-4051-884a-8b1bfb331618/volumes" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.056517 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-66f4595798-c276m"] Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057101 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="extract-utilities" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057115 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="extract-utilities" Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057127 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="util" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057134 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="util" Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057142 4926 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="pull" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057150 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="pull" Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057166 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="extract-content" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057173 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="extract-content" Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057184 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="registry-server" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057191 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="registry-server" Mar 12 18:19:13 crc kubenswrapper[4926]: E0312 18:19:13.057209 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="extract" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057216 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="extract" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057342 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3d1b5b0-e990-4051-884a-8b1bfb331618" containerName="registry-server" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057360 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a47463c-c539-4a5c-a3fd-c09feef2de67" containerName="extract" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.057870 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.060193 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-q8skp" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.080864 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66f4595798-c276m"] Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.117476 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km57g\" (UniqueName: \"kubernetes.io/projected/fe4ee666-a3c1-44c5-a07b-5ca8438e0482-kube-api-access-km57g\") pod \"openstack-operator-controller-init-66f4595798-c276m\" (UID: \"fe4ee666-a3c1-44c5-a07b-5ca8438e0482\") " pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.218818 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km57g\" (UniqueName: \"kubernetes.io/projected/fe4ee666-a3c1-44c5-a07b-5ca8438e0482-kube-api-access-km57g\") pod \"openstack-operator-controller-init-66f4595798-c276m\" (UID: \"fe4ee666-a3c1-44c5-a07b-5ca8438e0482\") " pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.240177 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km57g\" (UniqueName: \"kubernetes.io/projected/fe4ee666-a3c1-44c5-a07b-5ca8438e0482-kube-api-access-km57g\") pod \"openstack-operator-controller-init-66f4595798-c276m\" (UID: \"fe4ee666-a3c1-44c5-a07b-5ca8438e0482\") " pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.378186 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.615173 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66f4595798-c276m"] Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.620761 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.620989 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.668971 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.874241 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.877259 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.882757 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.930823 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7sb\" (UniqueName: \"kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.930955 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:13 crc kubenswrapper[4926]: I0312 18:19:13.931002 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.031526 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.031572 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.031627 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7sb\" (UniqueName: \"kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.032285 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.032312 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.051145 4926 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wp7sb\" (UniqueName: \"kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb\") pod \"redhat-marketplace-lkpwt\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.202466 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.398912 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:14 crc kubenswrapper[4926]: W0312 18:19:14.414855 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5760df4b_1304_4183_8ab8_9fbfb9bf834e.slice/crio-e12189e32000defc9737b3f0fdd65ebb428033f60ea171fd2cf14de8505756de WatchSource:0}: Error finding container e12189e32000defc9737b3f0fdd65ebb428033f60ea171fd2cf14de8505756de: Status 404 returned error can't find the container with id e12189e32000defc9737b3f0fdd65ebb428033f60ea171fd2cf14de8505756de Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.448111 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerStarted","Data":"e12189e32000defc9737b3f0fdd65ebb428033f60ea171fd2cf14de8505756de"} Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.449665 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" event={"ID":"fe4ee666-a3c1-44c5-a07b-5ca8438e0482","Type":"ContainerStarted","Data":"d532efc1ea779c2d60ac6c2534240645c07ed3d7a7cd4a1d3ecae9187aefb3b9"} Mar 12 18:19:14 crc kubenswrapper[4926]: I0312 18:19:14.502225 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:15 crc kubenswrapper[4926]: I0312 18:19:15.456980 4926 generic.go:334] "Generic (PLEG): container finished" podID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerID="ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb" exitCode=0 Mar 12 18:19:15 crc kubenswrapper[4926]: I0312 18:19:15.457081 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerDied","Data":"ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb"} Mar 12 18:19:16 crc kubenswrapper[4926]: I0312 18:19:16.449140 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:17 crc kubenswrapper[4926]: I0312 18:19:17.482089 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v54tn" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="registry-server" containerID="cri-o://2a931491103b2d031737572ba24dfcf764dd2ec0c44c2ffec92cc6d03c2bd8ba" gracePeriod=2 Mar 12 18:19:18 crc kubenswrapper[4926]: I0312 18:19:18.491053 4926 generic.go:334] "Generic (PLEG): container finished" podID="b1bc86da-e204-472f-9744-af11c0e8a265" containerID="2a931491103b2d031737572ba24dfcf764dd2ec0c44c2ffec92cc6d03c2bd8ba" exitCode=0 Mar 12 18:19:18 crc kubenswrapper[4926]: I0312 18:19:18.500886 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerDied","Data":"2a931491103b2d031737572ba24dfcf764dd2ec0c44c2ffec92cc6d03c2bd8ba"} Mar 12 18:19:19 crc kubenswrapper[4926]: I0312 18:19:19.905553 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.039799 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content\") pod \"b1bc86da-e204-472f-9744-af11c0e8a265\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.040128 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities\") pod \"b1bc86da-e204-472f-9744-af11c0e8a265\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.040274 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q26vh\" (UniqueName: \"kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh\") pod \"b1bc86da-e204-472f-9744-af11c0e8a265\" (UID: \"b1bc86da-e204-472f-9744-af11c0e8a265\") " Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.043499 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities" (OuterVolumeSpecName: "utilities") pod "b1bc86da-e204-472f-9744-af11c0e8a265" (UID: "b1bc86da-e204-472f-9744-af11c0e8a265"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.047137 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh" (OuterVolumeSpecName: "kube-api-access-q26vh") pod "b1bc86da-e204-472f-9744-af11c0e8a265" (UID: "b1bc86da-e204-472f-9744-af11c0e8a265"). InnerVolumeSpecName "kube-api-access-q26vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.098270 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1bc86da-e204-472f-9744-af11c0e8a265" (UID: "b1bc86da-e204-472f-9744-af11c0e8a265"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.141916 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.141955 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1bc86da-e204-472f-9744-af11c0e8a265-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.141983 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q26vh\" (UniqueName: \"kubernetes.io/projected/b1bc86da-e204-472f-9744-af11c0e8a265-kube-api-access-q26vh\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.505210 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v54tn" event={"ID":"b1bc86da-e204-472f-9744-af11c0e8a265","Type":"ContainerDied","Data":"26d978bbd24802db4c024d84e6fb86f4ea3699958fdd595fabcd49191a4e9809"} Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.505284 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v54tn" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.505301 4926 scope.go:117] "RemoveContainer" containerID="2a931491103b2d031737572ba24dfcf764dd2ec0c44c2ffec92cc6d03c2bd8ba" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.507143 4926 generic.go:334] "Generic (PLEG): container finished" podID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerID="321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d" exitCode=0 Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.507246 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerDied","Data":"321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d"} Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.510957 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" event={"ID":"fe4ee666-a3c1-44c5-a07b-5ca8438e0482","Type":"ContainerStarted","Data":"919f28d38b2ea6bc9cc52444fe8a3644659066c1d1bad126cd7ef0693ae5e8f1"} Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.512082 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.530327 4926 scope.go:117] "RemoveContainer" containerID="8f6bfe79045806d74f84405e9ff1f3a84c03533efa8e6773bb09b52e303ff4e2" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.543822 4926 scope.go:117] "RemoveContainer" containerID="88e86e609584800aecd00ec27240db80609a2d80782a4ff3069326cd7151ac45" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.563695 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" podStartSLOduration=1.285328016 podStartE2EDuration="7.563670999s" podCreationTimestamp="2026-03-12 18:19:13 +0000 UTC" firstStartedPulling="2026-03-12 18:19:13.626251352 +0000 UTC m=+993.994877685" lastFinishedPulling="2026-03-12 18:19:19.904594335 +0000 UTC m=+1000.273220668" 
observedRunningTime="2026-03-12 18:19:20.559143739 +0000 UTC m=+1000.927770072" watchObservedRunningTime="2026-03-12 18:19:20.563670999 +0000 UTC m=+1000.932297372" Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.590486 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:20 crc kubenswrapper[4926]: I0312 18:19:20.594219 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v54tn"] Mar 12 18:19:21 crc kubenswrapper[4926]: I0312 18:19:21.521116 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerStarted","Data":"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171"} Mar 12 18:19:21 crc kubenswrapper[4926]: I0312 18:19:21.541956 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lkpwt" podStartSLOduration=3.507176673 podStartE2EDuration="8.541938166s" podCreationTimestamp="2026-03-12 18:19:13 +0000 UTC" firstStartedPulling="2026-03-12 18:19:16.098075235 +0000 UTC m=+996.466701568" lastFinishedPulling="2026-03-12 18:19:21.132836728 +0000 UTC m=+1001.501463061" observedRunningTime="2026-03-12 18:19:21.537359593 +0000 UTC m=+1001.905985916" watchObservedRunningTime="2026-03-12 18:19:21.541938166 +0000 UTC m=+1001.910564499" Mar 12 18:19:22 crc kubenswrapper[4926]: I0312 18:19:22.497653 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" path="/var/lib/kubelet/pods/b1bc86da-e204-472f-9744-af11c0e8a265/volumes" Mar 12 18:19:24 crc kubenswrapper[4926]: I0312 18:19:24.203290 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:24 crc kubenswrapper[4926]: I0312 18:19:24.204474 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:24 crc kubenswrapper[4926]: I0312 18:19:24.279814 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:33 crc kubenswrapper[4926]: I0312 18:19:33.382085 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-66f4595798-c276m" Mar 12 18:19:34 crc kubenswrapper[4926]: I0312 18:19:34.298077 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:34 crc kubenswrapper[4926]: I0312 18:19:34.338294 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:34 crc kubenswrapper[4926]: I0312 18:19:34.612218 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lkpwt" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="registry-server" containerID="cri-o://8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171" gracePeriod=2 Mar 12 18:19:34 crc kubenswrapper[4926]: I0312 18:19:34.974884 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.090535 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content\") pod \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.092707 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities\") pod \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.092770 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp7sb\" (UniqueName: \"kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb\") pod \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\" (UID: \"5760df4b-1304-4183-8ab8-9fbfb9bf834e\") " Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.094549 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities" (OuterVolumeSpecName: "utilities") pod "5760df4b-1304-4183-8ab8-9fbfb9bf834e" (UID: "5760df4b-1304-4183-8ab8-9fbfb9bf834e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.099268 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb" (OuterVolumeSpecName: "kube-api-access-wp7sb") pod "5760df4b-1304-4183-8ab8-9fbfb9bf834e" (UID: "5760df4b-1304-4183-8ab8-9fbfb9bf834e"). InnerVolumeSpecName "kube-api-access-wp7sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.126288 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5760df4b-1304-4183-8ab8-9fbfb9bf834e" (UID: "5760df4b-1304-4183-8ab8-9fbfb9bf834e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.195194 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.195258 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp7sb\" (UniqueName: \"kubernetes.io/projected/5760df4b-1304-4183-8ab8-9fbfb9bf834e-kube-api-access-wp7sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.195282 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5760df4b-1304-4183-8ab8-9fbfb9bf834e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.623761 4926 generic.go:334] "Generic (PLEG): container finished" podID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerID="8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171" exitCode=0 Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.623834 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerDied","Data":"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171"} Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.623861 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkpwt" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.623898 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkpwt" event={"ID":"5760df4b-1304-4183-8ab8-9fbfb9bf834e","Type":"ContainerDied","Data":"e12189e32000defc9737b3f0fdd65ebb428033f60ea171fd2cf14de8505756de"} Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.623939 4926 scope.go:117] "RemoveContainer" containerID="8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.658744 4926 scope.go:117] "RemoveContainer" containerID="321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.662840 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.670317 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkpwt"] Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.695819 4926 scope.go:117] "RemoveContainer" containerID="ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.714523 4926 scope.go:117] "RemoveContainer" containerID="8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171" Mar 12 18:19:35 crc kubenswrapper[4926]: E0312 18:19:35.714851 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171\": container with ID starting with 8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171 not found: ID does not exist" containerID="8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.714904 4926 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171"} err="failed to get container status \"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171\": rpc error: code = NotFound desc = could not find container \"8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171\": container with ID starting with 8cfe4fe1a1488acebc531e1b4876a26cd083167fab9b1c38ab77343ef7244171 not found: ID does not exist" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.714946 4926 scope.go:117] "RemoveContainer" containerID="321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d" Mar 12 18:19:35 crc kubenswrapper[4926]: E0312 18:19:35.715323 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d\": container with ID starting with 321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d not found: ID does not exist" containerID="321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.715363 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d"} err="failed to get container status \"321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d\": rpc error: code = NotFound desc = could not find container \"321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d\": container with ID starting with 321acb9fc46bde9a355bab5d1b06a1220bab5d4dee54e97656cf70a21d113f6d not found: ID does not exist" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.715385 4926 scope.go:117] "RemoveContainer" containerID="ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb" Mar 12 18:19:35 crc kubenswrapper[4926]: E0312 18:19:35.715660 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb\": container with ID starting with ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb not found: ID does not exist" containerID="ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb" Mar 12 18:19:35 crc kubenswrapper[4926]: I0312 18:19:35.715689 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb"} err="failed to get container status \"ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb\": rpc error: code = NotFound desc = could not find container \"ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb\": container with ID starting with ed4aa3e456adb49e71017dd6c15eb8257568bdc74ddf99e569ac60e8d26b5ffb not found: ID does not exist" Mar 12 18:19:36 crc kubenswrapper[4926]: I0312 18:19:36.502697 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" path="/var/lib/kubelet/pods/5760df4b-1304-4183-8ab8-9fbfb9bf834e/volumes" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.134023 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555660-cbx5b"] Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135005 4926 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135024 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135043 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="extract-content" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135053 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="extract-content" Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135070 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135081 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135121 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="extract-utilities" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135132 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="extract-utilities" Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135147 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="extract-content" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135159 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="extract-content" Mar 12 18:20:00 crc kubenswrapper[4926]: E0312 18:20:00.135178 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="extract-utilities" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135188 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="extract-utilities" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135355 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1bc86da-e204-472f-9744-af11c0e8a265" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135371 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5760df4b-1304-4183-8ab8-9fbfb9bf834e" containerName="registry-server" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.135990 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.138293 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.138670 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.139180 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.145465 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555660-cbx5b"] Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.268455 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6s5b\" (UniqueName: \"kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b\") pod \"auto-csr-approver-29555660-cbx5b\" (UID: \"b50fb579-57d6-4029-a4f3-c8a3303bac4d\") " pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.370334 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6s5b\" (UniqueName: \"kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b\") pod \"auto-csr-approver-29555660-cbx5b\" (UID: \"b50fb579-57d6-4029-a4f3-c8a3303bac4d\") " pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.396271 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6s5b\" (UniqueName: \"kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b\") pod \"auto-csr-approver-29555660-cbx5b\" (UID: \"b50fb579-57d6-4029-a4f3-c8a3303bac4d\") " pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.453958 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.687580 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555660-cbx5b"] Mar 12 18:20:00 crc kubenswrapper[4926]: I0312 18:20:00.788232 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" event={"ID":"b50fb579-57d6-4029-a4f3-c8a3303bac4d","Type":"ContainerStarted","Data":"f263435b763cc115efbb77597e4b93b8d2c4a1f2f6ba06fd3f2f757eb1164e16"} Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.436323 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.437759 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.445935 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5lgf2" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.449162 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.450172 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.453666 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.458008 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.458330 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9x7p7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.458720 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.460119 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-j6zqr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.476337 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.480924 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.486655 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.487554 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.491650 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-nwrjb" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.498678 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phfsx\" (UniqueName: \"kubernetes.io/projected/e7b3fba0-ddaa-4cef-9df6-0683a92475cf-kube-api-access-phfsx\") pod \"barbican-operator-controller-manager-677bd678f7-ssmx7\" (UID: \"e7b3fba0-ddaa-4cef-9df6-0683a92475cf\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.498771 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cqv\" (UniqueName: \"kubernetes.io/projected/f112cb87-7454-41fa-a1e1-381d79f86247-kube-api-access-85cqv\") pod \"cinder-operator-controller-manager-984cd4dcf-zlsrd\" (UID: \"f112cb87-7454-41fa-a1e1-381d79f86247\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.509213 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-525z5"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.514080 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.517920 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-xnnxs" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.546013 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.575772 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-525z5"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.587432 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.588357 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.591188 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tdvjg" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.599693 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2zjq\" (UniqueName: \"kubernetes.io/projected/4800992f-cfad-4a1c-94e5-79427f88c002-kube-api-access-n2zjq\") pod \"designate-operator-controller-manager-66d56f6ff4-clngf\" (UID: \"4800992f-cfad-4a1c-94e5-79427f88c002\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.599751 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phfsx\" (UniqueName: \"kubernetes.io/projected/e7b3fba0-ddaa-4cef-9df6-0683a92475cf-kube-api-access-phfsx\") pod \"barbican-operator-controller-manager-677bd678f7-ssmx7\" (UID: \"e7b3fba0-ddaa-4cef-9df6-0683a92475cf\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.599790 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d55vh\" (UniqueName: \"kubernetes.io/projected/fea71415-42ac-4e77-ba9c-25170ccece27-kube-api-access-d55vh\") pod \"heat-operator-controller-manager-77b6666d85-525z5\" (UID: \"fea71415-42ac-4e77-ba9c-25170ccece27\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.599904 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p26t\" (UniqueName: \"kubernetes.io/projected/b776be98-1352-43c6-8ee8-e31076b7d12b-kube-api-access-7p26t\") pod \"glance-operator-controller-manager-5964f64c48-m6sxh\" (UID: \"b776be98-1352-43c6-8ee8-e31076b7d12b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.599934 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cqv\" (UniqueName: \"kubernetes.io/projected/f112cb87-7454-41fa-a1e1-381d79f86247-kube-api-access-85cqv\") pod \"cinder-operator-controller-manager-984cd4dcf-zlsrd\" (UID: \"f112cb87-7454-41fa-a1e1-381d79f86247\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.634620 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.641806 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cqv\" (UniqueName: \"kubernetes.io/projected/f112cb87-7454-41fa-a1e1-381d79f86247-kube-api-access-85cqv\") pod \"cinder-operator-controller-manager-984cd4dcf-zlsrd\" (UID: \"f112cb87-7454-41fa-a1e1-381d79f86247\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.642520 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phfsx\" (UniqueName: 
\"kubernetes.io/projected/e7b3fba0-ddaa-4cef-9df6-0683a92475cf-kube-api-access-phfsx\") pod \"barbican-operator-controller-manager-677bd678f7-ssmx7\" (UID: \"e7b3fba0-ddaa-4cef-9df6-0683a92475cf\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.648592 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.649542 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.653733 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.655415 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-mqcdw" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.673755 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.674719 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.677210 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rbkds" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.681604 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.700931 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.701676 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p26t\" (UniqueName: \"kubernetes.io/projected/b776be98-1352-43c6-8ee8-e31076b7d12b-kube-api-access-7p26t\") pod \"glance-operator-controller-manager-5964f64c48-m6sxh\" (UID: \"b776be98-1352-43c6-8ee8-e31076b7d12b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.701726 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg25f\" (UniqueName: \"kubernetes.io/projected/065fe73a-651c-4cd3-b8d7-135617c51bbd-kube-api-access-fg25f\") pod \"horizon-operator-controller-manager-6d9d6b584d-p76mx\" (UID: \"065fe73a-651c-4cd3-b8d7-135617c51bbd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.701762 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2zjq\" (UniqueName: \"kubernetes.io/projected/4800992f-cfad-4a1c-94e5-79427f88c002-kube-api-access-n2zjq\") pod \"designate-operator-controller-manager-66d56f6ff4-clngf\" (UID: \"4800992f-cfad-4a1c-94e5-79427f88c002\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 
18:20:02.701791 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d55vh\" (UniqueName: \"kubernetes.io/projected/fea71415-42ac-4e77-ba9c-25170ccece27-kube-api-access-d55vh\") pod \"heat-operator-controller-manager-77b6666d85-525z5\" (UID: \"fea71415-42ac-4e77-ba9c-25170ccece27\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.701816 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.701834 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkzx\" (UniqueName: \"kubernetes.io/projected/fd35525c-7b73-49d1-a36c-c49d3bf933eb-kube-api-access-hwkzx\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.705048 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.715376 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.716692 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.723587 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hqcpj" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.723587 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.724739 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.740025 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p26t\" (UniqueName: \"kubernetes.io/projected/b776be98-1352-43c6-8ee8-e31076b7d12b-kube-api-access-7p26t\") pod \"glance-operator-controller-manager-5964f64c48-m6sxh\" (UID: \"b776be98-1352-43c6-8ee8-e31076b7d12b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.740398 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cf7zh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.751056 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d55vh\" (UniqueName: \"kubernetes.io/projected/fea71415-42ac-4e77-ba9c-25170ccece27-kube-api-access-d55vh\") pod \"heat-operator-controller-manager-77b6666d85-525z5\" (UID: \"fea71415-42ac-4e77-ba9c-25170ccece27\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.753994 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2zjq\" (UniqueName: \"kubernetes.io/projected/4800992f-cfad-4a1c-94e5-79427f88c002-kube-api-access-n2zjq\") pod \"designate-operator-controller-manager-66d56f6ff4-clngf\" (UID: \"4800992f-cfad-4a1c-94e5-79427f88c002\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.764600 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.764941 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.784732 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.786511 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.787603 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.791731 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-bphvw" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.808789 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.808835 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkzx\" (UniqueName: \"kubernetes.io/projected/fd35525c-7b73-49d1-a36c-c49d3bf933eb-kube-api-access-hwkzx\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.808879 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcf6\" (UniqueName: \"kubernetes.io/projected/c62baaa0-d72b-4240-9dba-858bdf61d1b3-kube-api-access-4pcf6\") pod \"keystone-operator-controller-manager-684f77d66d-5f6c4\" (UID: \"c62baaa0-d72b-4240-9dba-858bdf61d1b3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.808895 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:02 crc kubenswrapper[4926]: E0312 18:20:02.809014 4926 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:02 crc kubenswrapper[4926]: E0312 18:20:02.809058 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert podName:fd35525c-7b73-49d1-a36c-c49d3bf933eb nodeName:}" failed. No retries permitted until 2026-03-12 18:20:03.30904156 +0000 UTC m=+1043.677667893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert") pod "infra-operator-controller-manager-5995f4446f-kzl7p" (UID: "fd35525c-7b73-49d1-a36c-c49d3bf933eb") : secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.808907 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9sjn\" (UniqueName: \"kubernetes.io/projected/a41da562-a119-4785-95d0-eaf0970a99f4-kube-api-access-d9sjn\") pod \"ironic-operator-controller-manager-6bbb499bbc-7q8xr\" (UID: \"a41da562-a119-4785-95d0-eaf0970a99f4\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.809315 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg25f\" (UniqueName: \"kubernetes.io/projected/065fe73a-651c-4cd3-b8d7-135617c51bbd-kube-api-access-fg25f\") pod \"horizon-operator-controller-manager-6d9d6b584d-p76mx\" (UID: \"065fe73a-651c-4cd3-b8d7-135617c51bbd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.809374 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4g6t\" (UniqueName: \"kubernetes.io/projected/e38cf931-bbd6-4b47-bdf6-8a514d17d3d7-kube-api-access-n4g6t\") pod \"manila-operator-controller-manager-68f45f9d9f-f96w7\" (UID: \"e38cf931-bbd6-4b47-bdf6-8a514d17d3d7\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.831647 4926 generic.go:334] "Generic (PLEG): container finished" podID="b50fb579-57d6-4029-a4f3-c8a3303bac4d" containerID="6f2e2b241a59eb4ec9fee306fab9cd670167a7fcd068ac1d7f97bc699a2c0ee6" exitCode=0 Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.831695 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" event={"ID":"b50fb579-57d6-4029-a4f3-c8a3303bac4d","Type":"ContainerDied","Data":"6f2e2b241a59eb4ec9fee306fab9cd670167a7fcd068ac1d7f97bc699a2c0ee6"} Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.844227 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.846596 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.859362 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.860131 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.860519 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.866300 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wdqjw" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.868309 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.869234 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.872940 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.876963 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-x9sdm" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.883266 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.887395 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg25f\" (UniqueName: \"kubernetes.io/projected/065fe73a-651c-4cd3-b8d7-135617c51bbd-kube-api-access-fg25f\") pod \"horizon-operator-controller-manager-6d9d6b584d-p76mx\" (UID: \"065fe73a-651c-4cd3-b8d7-135617c51bbd\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.888370 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2fqgq" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.893079 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkzx\" (UniqueName: \"kubernetes.io/projected/fd35525c-7b73-49d1-a36c-c49d3bf933eb-kube-api-access-hwkzx\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.894615 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.904426 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.908902 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.913342 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914282 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wbk\" (UniqueName: \"kubernetes.io/projected/2e44c177-b87d-4ff6-80ca-672477fe9e94-kube-api-access-q5wbk\") pod \"nova-operator-controller-manager-569cc54c5-dzj5w\" (UID: \"2e44c177-b87d-4ff6-80ca-672477fe9e94\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914324 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q249w\" (UniqueName: \"kubernetes.io/projected/ccd465e8-6811-4865-9602-3dea8144cc01-kube-api-access-q249w\") pod \"mariadb-operator-controller-manager-658d4cdd5-c58lh\" (UID: \"ccd465e8-6811-4865-9602-3dea8144cc01\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914352 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4g6t\" (UniqueName: \"kubernetes.io/projected/e38cf931-bbd6-4b47-bdf6-8a514d17d3d7-kube-api-access-n4g6t\") pod \"manila-operator-controller-manager-68f45f9d9f-f96w7\" (UID: \"e38cf931-bbd6-4b47-bdf6-8a514d17d3d7\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914481 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rslhc\" (UniqueName: \"kubernetes.io/projected/6374b396-00ef-4aca-ac07-fd46982f23f1-kube-api-access-rslhc\") pod \"neutron-operator-controller-manager-776c5696bf-njrjb\" (UID: \"6374b396-00ef-4aca-ac07-fd46982f23f1\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914529 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcf6\" (UniqueName: \"kubernetes.io/projected/c62baaa0-d72b-4240-9dba-858bdf61d1b3-kube-api-access-4pcf6\") pod \"keystone-operator-controller-manager-684f77d66d-5f6c4\" (UID: \"c62baaa0-d72b-4240-9dba-858bdf61d1b3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.914568 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9sjn\" (UniqueName: \"kubernetes.io/projected/a41da562-a119-4785-95d0-eaf0970a99f4-kube-api-access-d9sjn\") pod \"ironic-operator-controller-manager-6bbb499bbc-7q8xr\" (UID: \"a41da562-a119-4785-95d0-eaf0970a99f4\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.917494 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.918602 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.921368 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-drhd2" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.930507 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.931546 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.935933 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-49m2z" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.940258 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.941204 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.955243 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.956685 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.956819 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.959602 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5jlln" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.959792 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.963214 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.966691 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4g6t\" (UniqueName: \"kubernetes.io/projected/e38cf931-bbd6-4b47-bdf6-8a514d17d3d7-kube-api-access-n4g6t\") pod \"manila-operator-controller-manager-68f45f9d9f-f96w7\" (UID: \"e38cf931-bbd6-4b47-bdf6-8a514d17d3d7\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.969740 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9sjn\" (UniqueName: \"kubernetes.io/projected/a41da562-a119-4785-95d0-eaf0970a99f4-kube-api-access-d9sjn\") pod \"ironic-operator-controller-manager-6bbb499bbc-7q8xr\" (UID: \"a41da562-a119-4785-95d0-eaf0970a99f4\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.970762 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747"] Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.979781 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gglcr" Mar 12 18:20:02 crc kubenswrapper[4926]: I0312 18:20:02.985292 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pcf6\" (UniqueName: \"kubernetes.io/projected/c62baaa0-d72b-4240-9dba-858bdf61d1b3-kube-api-access-4pcf6\") pod \"keystone-operator-controller-manager-684f77d66d-5f6c4\" (UID: \"c62baaa0-d72b-4240-9dba-858bdf61d1b3\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.002760 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.032517 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f48c2\" (UniqueName: \"kubernetes.io/projected/14346445-95be-488f-858c-44bf5b45c656-kube-api-access-f48c2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xr747\" (UID: \"14346445-95be-488f-858c-44bf5b45c656\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.032613 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8wk\" (UniqueName: \"kubernetes.io/projected/e3baa344-9dd8-48ed-8b6a-60ff9fbc181a-kube-api-access-hg8wk\") pod \"placement-operator-controller-manager-574d45c66c-tpnlt\" (UID: \"e3baa344-9dd8-48ed-8b6a-60ff9fbc181a\") " 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.032673 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rslhc\" (UniqueName: \"kubernetes.io/projected/6374b396-00ef-4aca-ac07-fd46982f23f1-kube-api-access-rslhc\") pod \"neutron-operator-controller-manager-776c5696bf-njrjb\" (UID: \"6374b396-00ef-4aca-ac07-fd46982f23f1\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.032784 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wbk\" (UniqueName: \"kubernetes.io/projected/2e44c177-b87d-4ff6-80ca-672477fe9e94-kube-api-access-q5wbk\") pod \"nova-operator-controller-manager-569cc54c5-dzj5w\" (UID: \"2e44c177-b87d-4ff6-80ca-672477fe9e94\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.033081 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tk87\" (UniqueName: \"kubernetes.io/projected/309d4a2a-f738-4d2d-a28e-f361f762f997-kube-api-access-4tk87\") pod \"swift-operator-controller-manager-677c674df7-r2cxk\" (UID: \"309d4a2a-f738-4d2d-a28e-f361f762f997\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.033151 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q249w\" (UniqueName: \"kubernetes.io/projected/ccd465e8-6811-4865-9602-3dea8144cc01-kube-api-access-q249w\") pod \"mariadb-operator-controller-manager-658d4cdd5-c58lh\" (UID: \"ccd465e8-6811-4865-9602-3dea8144cc01\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.033184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7j4b\" (UniqueName: \"kubernetes.io/projected/464725b8-2734-43a4-a232-5db9bafed311-kube-api-access-w7j4b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-cv97b\" (UID: \"464725b8-2734-43a4-a232-5db9bafed311\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.033243 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.033284 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf95s\" (UniqueName: \"kubernetes.io/projected/5d4dea90-1696-4195-a0a0-71a3c9f3e328-kube-api-access-nf95s\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.047633 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.087702 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q249w\" (UniqueName: \"kubernetes.io/projected/ccd465e8-6811-4865-9602-3dea8144cc01-kube-api-access-q249w\") pod \"mariadb-operator-controller-manager-658d4cdd5-c58lh\" (UID: \"ccd465e8-6811-4865-9602-3dea8144cc01\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.091784 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rslhc\" (UniqueName: \"kubernetes.io/projected/6374b396-00ef-4aca-ac07-fd46982f23f1-kube-api-access-rslhc\") pod \"neutron-operator-controller-manager-776c5696bf-njrjb\" (UID: \"6374b396-00ef-4aca-ac07-fd46982f23f1\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.097316 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wbk\" (UniqueName: \"kubernetes.io/projected/2e44c177-b87d-4ff6-80ca-672477fe9e94-kube-api-access-q5wbk\") pod \"nova-operator-controller-manager-569cc54c5-dzj5w\" (UID: \"2e44c177-b87d-4ff6-80ca-672477fe9e94\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.101590 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.103915 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.118569 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hwv2s" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.122897 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.135045 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tk87\" (UniqueName: \"kubernetes.io/projected/309d4a2a-f738-4d2d-a28e-f361f762f997-kube-api-access-4tk87\") pod \"swift-operator-controller-manager-677c674df7-r2cxk\" (UID: \"309d4a2a-f738-4d2d-a28e-f361f762f997\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.136263 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7j4b\" (UniqueName: \"kubernetes.io/projected/464725b8-2734-43a4-a232-5db9bafed311-kube-api-access-w7j4b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-cv97b\" (UID: \"464725b8-2734-43a4-a232-5db9bafed311\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.136347 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.136385 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf95s\" (UniqueName: \"kubernetes.io/projected/5d4dea90-1696-4195-a0a0-71a3c9f3e328-kube-api-access-nf95s\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.136500 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f48c2\" (UniqueName: \"kubernetes.io/projected/14346445-95be-488f-858c-44bf5b45c656-kube-api-access-f48c2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xr747\" (UID: \"14346445-95be-488f-858c-44bf5b45c656\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.136565 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8wk\" (UniqueName: \"kubernetes.io/projected/e3baa344-9dd8-48ed-8b6a-60ff9fbc181a-kube-api-access-hg8wk\") pod \"placement-operator-controller-manager-574d45c66c-tpnlt\" (UID: \"e3baa344-9dd8-48ed-8b6a-60ff9fbc181a\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.137715 4926 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.137809 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert podName:5d4dea90-1696-4195-a0a0-71a3c9f3e328 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:03.637756757 +0000 UTC m=+1044.006383090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" (UID: "5d4dea90-1696-4195-a0a0-71a3c9f3e328") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.154673 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.159424 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.173641 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7j4b\" (UniqueName: \"kubernetes.io/projected/464725b8-2734-43a4-a232-5db9bafed311-kube-api-access-w7j4b\") pod \"octavia-operator-controller-manager-5f4f55cb5c-cv97b\" (UID: \"464725b8-2734-43a4-a232-5db9bafed311\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.176892 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f48c2\" (UniqueName: \"kubernetes.io/projected/14346445-95be-488f-858c-44bf5b45c656-kube-api-access-f48c2\") pod \"ovn-operator-controller-manager-bbc5b68f9-xr747\" (UID: \"14346445-95be-488f-858c-44bf5b45c656\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.177040 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tk87\" (UniqueName: \"kubernetes.io/projected/309d4a2a-f738-4d2d-a28e-f361f762f997-kube-api-access-4tk87\") pod \"swift-operator-controller-manager-677c674df7-r2cxk\" (UID: \"309d4a2a-f738-4d2d-a28e-f361f762f997\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.177490 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8wk\" (UniqueName: \"kubernetes.io/projected/e3baa344-9dd8-48ed-8b6a-60ff9fbc181a-kube-api-access-hg8wk\") pod \"placement-operator-controller-manager-574d45c66c-tpnlt\" (UID: \"e3baa344-9dd8-48ed-8b6a-60ff9fbc181a\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.177628 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf95s\" (UniqueName: \"kubernetes.io/projected/5d4dea90-1696-4195-a0a0-71a3c9f3e328-kube-api-access-nf95s\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.189537 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.190376 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.193322 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2tf46" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.193685 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.212644 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.231956 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.235765 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.236916 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.242373 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmp4h\" (UniqueName: \"kubernetes.io/projected/45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd-kube-api-access-fmp4h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-z87mn\" (UID: \"45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.248775 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-sm27l" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.252039 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.256355 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.275084 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.322376 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.330692 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.331202 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.331304 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.333588 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.334073 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6t2jw" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.338535 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.343398 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.343463 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjm58\" (UniqueName: \"kubernetes.io/projected/682f3a0f-7437-455c-99e1-8b7cdb03328a-kube-api-access-pjm58\") pod \"test-operator-controller-manager-5c5cb9c4d7-25jrt\" (UID: \"682f3a0f-7437-455c-99e1-8b7cdb03328a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.343487 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s7t\" (UniqueName: \"kubernetes.io/projected/80f26a3f-ab1e-49b1-8843-6674d948a5cd-kube-api-access-d6s7t\") pod \"watcher-operator-controller-manager-6dd88c6f67-cn7fq\" (UID: \"80f26a3f-ab1e-49b1-8843-6674d948a5cd\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.343512 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmp4h\" (UniqueName: \"kubernetes.io/projected/45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd-kube-api-access-fmp4h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-z87mn\" (UID: \"45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.343648 4926 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.343702 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert podName:fd35525c-7b73-49d1-a36c-c49d3bf933eb nodeName:}" failed. No retries permitted until 2026-03-12 18:20:04.343685984 +0000 UTC m=+1044.712312307 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert") pod "infra-operator-controller-manager-5995f4446f-kzl7p" (UID: "fd35525c-7b73-49d1-a36c-c49d3bf933eb") : secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.358728 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.360066 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.361035 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.365706 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2q2p5" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.365929 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.372986 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmp4h\" (UniqueName: \"kubernetes.io/projected/45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd-kube-api-access-fmp4h\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-z87mn\" (UID: \"45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.406680 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.444240 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjm58\" (UniqueName: \"kubernetes.io/projected/682f3a0f-7437-455c-99e1-8b7cdb03328a-kube-api-access-pjm58\") pod \"test-operator-controller-manager-5c5cb9c4d7-25jrt\" (UID: \"682f3a0f-7437-455c-99e1-8b7cdb03328a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.444280 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s7t\" (UniqueName: \"kubernetes.io/projected/80f26a3f-ab1e-49b1-8843-6674d948a5cd-kube-api-access-d6s7t\") pod \"watcher-operator-controller-manager-6dd88c6f67-cn7fq\" (UID: \"80f26a3f-ab1e-49b1-8843-6674d948a5cd\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.444337 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhjg\" (UniqueName: \"kubernetes.io/projected/fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b-kube-api-access-fdhjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wqzk9\" (UID: \"fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.444410 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.445045 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q6pl\" (UniqueName: \"kubernetes.io/projected/495cec8d-a262-4d55-9ee5-6eebb10b6765-kube-api-access-6q6pl\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.445072 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.470972 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjm58\" (UniqueName: \"kubernetes.io/projected/682f3a0f-7437-455c-99e1-8b7cdb03328a-kube-api-access-pjm58\") pod \"test-operator-controller-manager-5c5cb9c4d7-25jrt\" (UID: \"682f3a0f-7437-455c-99e1-8b7cdb03328a\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.473983 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s7t\" (UniqueName: \"kubernetes.io/projected/80f26a3f-ab1e-49b1-8843-6674d948a5cd-kube-api-access-d6s7t\") pod \"watcher-operator-controller-manager-6dd88c6f67-cn7fq\" (UID: \"80f26a3f-ab1e-49b1-8843-6674d948a5cd\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.546515 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhjg\" (UniqueName: \"kubernetes.io/projected/fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b-kube-api-access-fdhjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wqzk9\" (UID: \"fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.546802 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.546905 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q6pl\" (UniqueName: \"kubernetes.io/projected/495cec8d-a262-4d55-9ee5-6eebb10b6765-kube-api-access-6q6pl\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.546985 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " 
pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.547169 4926 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.547267 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:04.047252658 +0000 UTC m=+1044.415878991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.547592 4926 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.547698 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:04.04768845 +0000 UTC m=+1044.416314783 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "metrics-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.569352 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q6pl\" (UniqueName: \"kubernetes.io/projected/495cec8d-a262-4d55-9ee5-6eebb10b6765-kube-api-access-6q6pl\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.570827 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhjg\" (UniqueName: \"kubernetes.io/projected/fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b-kube-api-access-fdhjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wqzk9\" (UID: \"fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.577135 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.636918 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.648367 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.648592 4926 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: E0312 18:20:03.649650 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert podName:5d4dea90-1696-4195-a0a0-71a3c9f3e328 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:04.649619642 +0000 UTC m=+1045.018246055 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" (UID: "5d4dea90-1696-4195-a0a0-71a3c9f3e328") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.657171 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.693993 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.706902 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.738822 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf"] Mar 12 18:20:03 crc kubenswrapper[4926]: I0312 18:20:03.933348 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.055632 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.055994 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.056141 4926 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.056209 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:05.056186822 +0000 UTC m=+1045.424813155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.056522 4926 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.056564 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:05.056551783 +0000 UTC m=+1045.425178116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "metrics-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.308406 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx"] Mar 12 18:20:04 crc kubenswrapper[4926]: W0312 18:20:04.322782 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065fe73a_651c_4cd3_b8d7_135617c51bbd.slice/crio-769059b055d102ba84effcc25a0d3adf60d1bd5e73e018efdcfab3e6a8f50679 WatchSource:0}: Error finding container 769059b055d102ba84effcc25a0d3adf60d1bd5e73e018efdcfab3e6a8f50679: Status 404 returned error can't find the container with id 769059b055d102ba84effcc25a0d3adf60d1bd5e73e018efdcfab3e6a8f50679 Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.328965 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.362118 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.362396 4926 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.362473 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert podName:fd35525c-7b73-49d1-a36c-c49d3bf933eb nodeName:}" failed. No retries permitted until 2026-03-12 18:20:06.362432629 +0000 UTC m=+1046.731058962 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert") pod "infra-operator-controller-manager-5995f4446f-kzl7p" (UID: "fd35525c-7b73-49d1-a36c-c49d3bf933eb") : secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.362756 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-525z5"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.545698 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.667979 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6s5b\" (UniqueName: \"kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b\") pod \"b50fb579-57d6-4029-a4f3-c8a3303bac4d\" (UID: \"b50fb579-57d6-4029-a4f3-c8a3303bac4d\") " Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.668304 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.668594 4926 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: E0312 18:20:04.668693 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert podName:5d4dea90-1696-4195-a0a0-71a3c9f3e328 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:06.668668837 +0000 UTC m=+1047.037295210 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" (UID: "5d4dea90-1696-4195-a0a0-71a3c9f3e328") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.687319 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b" (OuterVolumeSpecName: "kube-api-access-f6s5b") pod "b50fb579-57d6-4029-a4f3-c8a3303bac4d" (UID: "b50fb579-57d6-4029-a4f3-c8a3303bac4d"). InnerVolumeSpecName "kube-api-access-f6s5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.722970 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.729865 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.746634 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.770060 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6s5b\" (UniqueName: \"kubernetes.io/projected/b50fb579-57d6-4029-a4f3-c8a3303bac4d-kube-api-access-f6s5b\") on node \"crc\" DevicePath \"\"" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.773063 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.792856 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.803039 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.875229 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" event={"ID":"b776be98-1352-43c6-8ee8-e31076b7d12b","Type":"ContainerStarted","Data":"3d8da108de485eedef583392ee33111fceb0e719a5535289dc724da90b0da38d"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.877456 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" event={"ID":"fea71415-42ac-4e77-ba9c-25170ccece27","Type":"ContainerStarted","Data":"0ac5f9503dca95dd3000648533056ba61503a5029e4d1cdeddcb1dde4d15f5be"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.878636 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" event={"ID":"4800992f-cfad-4a1c-94e5-79427f88c002","Type":"ContainerStarted","Data":"8b66ab767d4f939f6e4072b918286c560db1a63ee14df30b0cfacbef55a2a510"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.882290 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" event={"ID":"b50fb579-57d6-4029-a4f3-c8a3303bac4d","Type":"ContainerDied","Data":"f263435b763cc115efbb77597e4b93b8d2c4a1f2f6ba06fd3f2f757eb1164e16"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.882322 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f263435b763cc115efbb77597e4b93b8d2c4a1f2f6ba06fd3f2f757eb1164e16" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.882367 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555660-cbx5b" Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.893600 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" event={"ID":"14346445-95be-488f-858c-44bf5b45c656","Type":"ContainerStarted","Data":"6cb8eda1b70336068b0d2ce8120252d67265b7566856a8302132942ab925b06d"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.897050 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" event={"ID":"065fe73a-651c-4cd3-b8d7-135617c51bbd","Type":"ContainerStarted","Data":"769059b055d102ba84effcc25a0d3adf60d1bd5e73e018efdcfab3e6a8f50679"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.898376 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" event={"ID":"c62baaa0-d72b-4240-9dba-858bdf61d1b3","Type":"ContainerStarted","Data":"364c57258777706c1f0a24919ec22ef6537c98d949064c3415e910cc8c29cd6c"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.899405 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" event={"ID":"f112cb87-7454-41fa-a1e1-381d79f86247","Type":"ContainerStarted","Data":"51e730ea5972d666a55bf8d4c459c11da26f5c9bac76f90c3683958639006898"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.900781 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" event={"ID":"e38cf931-bbd6-4b47-bdf6-8a514d17d3d7","Type":"ContainerStarted","Data":"44f318f32ee118844455e111852282b7cf9d2b00c171ca1de24ab41a19fa9fe3"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.904854 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" event={"ID":"a41da562-a119-4785-95d0-eaf0970a99f4","Type":"ContainerStarted","Data":"ec458530e7483a1fa11b051ad6e9776ee400c828e8cc57d04fbddf74f8415400"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.907836 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" event={"ID":"e7b3fba0-ddaa-4cef-9df6-0683a92475cf","Type":"ContainerStarted","Data":"e5317c600ec77fbdf355fbc5e88cec64b1b8c074a90201367a8a16a527007d4b"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.909180 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" event={"ID":"464725b8-2734-43a4-a232-5db9bafed311","Type":"ContainerStarted","Data":"ed0dfb9c9bd787bf5584b3bf7cecc5cd59350cfffef8eddebbf70c4c9c22fd98"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.910606 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" event={"ID":"e3baa344-9dd8-48ed-8b6a-60ff9fbc181a","Type":"ContainerStarted","Data":"a61437a3a4f883b3927166fd4ef0d47de54e75d5f2abcd145622b9256b247c27"} Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.973078 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.982237 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.988764 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn"] Mar 12 18:20:04 crc kubenswrapper[4926]: I0312 18:20:04.995835 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb"] Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.001468 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq"] Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.008760 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt"] Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.010138 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod682f3a0f_7437_455c_99e1_8b7cdb03328a.slice/crio-335cc127803561e2b305c4cec2665d2eebd4d9255df56259a72ffb8b0f0af07b WatchSource:0}: Error finding container 335cc127803561e2b305c4cec2665d2eebd4d9255df56259a72ffb8b0f0af07b: Status 404 returned error can't find the container with id 335cc127803561e2b305c4cec2665d2eebd4d9255df56259a72ffb8b0f0af07b Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.010669 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6165e6_3d8f_4ddd_b6ae_a1307f2c6b3b.slice/crio-5ef16cf6580abee89b6c12f326482775078e8159285ebfe8f969b5da392be107 WatchSource:0}: Error finding container 5ef16cf6580abee89b6c12f326482775078e8159285ebfe8f969b5da392be107: Status 404 returned error can't find the container with id 5ef16cf6580abee89b6c12f326482775078e8159285ebfe8f969b5da392be107 Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.010956 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w"] Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.015120 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh"] Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.016131 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f26a3f_ab1e_49b1_8843_6674d948a5cd.slice/crio-7fbaa0f112fbfa5bb70bb72a413114ef56396c426f8b3faacf7d85355d570657 WatchSource:0}: Error finding container 7fbaa0f112fbfa5bb70bb72a413114ef56396c426f8b3faacf7d85355d570657: Status 404 returned error can't find the container with id 7fbaa0f112fbfa5bb70bb72a413114ef56396c426f8b3faacf7d85355d570657 Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.018779 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309d4a2a_f738_4d2d_a28e_f361f762f997.slice/crio-2e657873af44236a401f737091b0b44d128edeb0486cf0aa500b55a6753caf8e WatchSource:0}: Error finding container 2e657873af44236a401f737091b0b44d128edeb0486cf0aa500b55a6753caf8e: Status 404 returned error can't find the container with id 2e657873af44236a401f737091b0b44d128edeb0486cf0aa500b55a6753caf8e Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.019387 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d6s7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-cn7fq_openstack-operators(80f26a3f-ab1e-49b1-8843-6674d948a5cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.020839 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" podUID="80f26a3f-ab1e-49b1-8843-6674d948a5cd" Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.034534 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6374b396_00ef_4aca_ac07_fd46982f23f1.slice/crio-95746e7b99ab105d593ac84930ec3c380ba9152103a98c99c3a6a6f2bd8f80c3 WatchSource:0}: Error finding container 95746e7b99ab105d593ac84930ec3c380ba9152103a98c99c3a6a6f2bd8f80c3: Status 404 returned error can't find the container with id 95746e7b99ab105d593ac84930ec3c380ba9152103a98c99c3a6a6f2bd8f80c3 Mar 12 18:20:05 crc kubenswrapper[4926]: W0312 18:20:05.042941 4926 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd465e8_6811_4865_9602_3dea8144cc01.slice/crio-c9cb830c25538d7dd733478498e753da1234bed8d1846481094737779bdc41d0 WatchSource:0}: Error finding container c9cb830c25538d7dd733478498e753da1234bed8d1846481094737779bdc41d0: Status 404 returned error can't find the container with id c9cb830c25538d7dd733478498e753da1234bed8d1846481094737779bdc41d0 Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.043081 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tk87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-r2cxk_openstack-operators(309d4a2a-f738-4d2d-a28e-f361f762f997): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.044222 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q5wbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-dzj5w_openstack-operators(2e44c177-b87d-4ff6-80ca-672477fe9e94): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.044373 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rslhc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-njrjb_openstack-operators(6374b396-00ef-4aca-ac07-fd46982f23f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.044460 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" podUID="309d4a2a-f738-4d2d-a28e-f361f762f997" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.046043 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" podUID="6374b396-00ef-4aca-ac07-fd46982f23f1" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.046096 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" podUID="2e44c177-b87d-4ff6-80ca-672477fe9e94" Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.074939 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.074996 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.075140 4926 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.075198 4926 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:07.075184765 +0000 UTC m=+1047.443811098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "webhook-server-cert" not found Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.075224 4926 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.075287 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:07.075270988 +0000 UTC m=+1047.443897321 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "metrics-server-cert" not found Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.618530 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555654-jjvcf"] Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.627150 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555654-jjvcf"] Mar 12 18:20:05 crc kubenswrapper[4926]: I0312 18:20:05.953624 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" event={"ID":"2e44c177-b87d-4ff6-80ca-672477fe9e94","Type":"ContainerStarted","Data":"3cc4bf1aa3b6fb6ba3ceb48192827b5a35da1ef79262d29fcc2f5c84013b48fa"} Mar 12 18:20:05 crc kubenswrapper[4926]: E0312 18:20:05.963609 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" podUID="2e44c177-b87d-4ff6-80ca-672477fe9e94" Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.000216 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" event={"ID":"45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd","Type":"ContainerStarted","Data":"686314c6445feb8e08cad680a4dd637dfe9e9d1abfe8ec01f297b4f3f04b5879"} Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.043425 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" event={"ID":"6374b396-00ef-4aca-ac07-fd46982f23f1","Type":"ContainerStarted","Data":"95746e7b99ab105d593ac84930ec3c380ba9152103a98c99c3a6a6f2bd8f80c3"} Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.044890 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" podUID="6374b396-00ef-4aca-ac07-fd46982f23f1" Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.046039 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" event={"ID":"309d4a2a-f738-4d2d-a28e-f361f762f997","Type":"ContainerStarted","Data":"2e657873af44236a401f737091b0b44d128edeb0486cf0aa500b55a6753caf8e"} Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.051647 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" podUID="309d4a2a-f738-4d2d-a28e-f361f762f997" Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.052391 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" event={"ID":"80f26a3f-ab1e-49b1-8843-6674d948a5cd","Type":"ContainerStarted","Data":"7fbaa0f112fbfa5bb70bb72a413114ef56396c426f8b3faacf7d85355d570657"} Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.053562 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" podUID="80f26a3f-ab1e-49b1-8843-6674d948a5cd" Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.057283 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" event={"ID":"ccd465e8-6811-4865-9602-3dea8144cc01","Type":"ContainerStarted","Data":"c9cb830c25538d7dd733478498e753da1234bed8d1846481094737779bdc41d0"} Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.058781 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" event={"ID":"682f3a0f-7437-455c-99e1-8b7cdb03328a","Type":"ContainerStarted","Data":"335cc127803561e2b305c4cec2665d2eebd4d9255df56259a72ffb8b0f0af07b"} Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.060007 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" event={"ID":"fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b","Type":"ContainerStarted","Data":"5ef16cf6580abee89b6c12f326482775078e8159285ebfe8f969b5da392be107"} Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.401093 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.401337 4926 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.401411 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert podName:fd35525c-7b73-49d1-a36c-c49d3bf933eb nodeName:}" failed. No retries permitted until 2026-03-12 18:20:10.401395886 +0000 UTC m=+1050.770022219 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert") pod "infra-operator-controller-manager-5995f4446f-kzl7p" (UID: "fd35525c-7b73-49d1-a36c-c49d3bf933eb") : secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.509037 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e47fc3-fa77-4992-b1f8-dfff8b4d924e" path="/var/lib/kubelet/pods/b3e47fc3-fa77-4992-b1f8-dfff8b4d924e/volumes" Mar 12 18:20:06 crc kubenswrapper[4926]: I0312 18:20:06.714181 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.714473 4926 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:06 crc kubenswrapper[4926]: E0312 18:20:06.714768 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert podName:5d4dea90-1696-4195-a0a0-71a3c9f3e328 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:10.714693123 +0000 UTC m=+1051.083319456 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" (UID: "5d4dea90-1696-4195-a0a0-71a3c9f3e328") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.076711 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" podUID="80f26a3f-ab1e-49b1-8843-6674d948a5cd" Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.076725 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" podUID="2e44c177-b87d-4ff6-80ca-672477fe9e94" Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.076788 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" podUID="6374b396-00ef-4aca-ac07-fd46982f23f1" Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.076851 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" podUID="309d4a2a-f738-4d2d-a28e-f361f762f997" Mar 12 18:20:07 crc kubenswrapper[4926]: I0312 18:20:07.120669 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:07 crc kubenswrapper[4926]: I0312 18:20:07.120804 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.121646 4926 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.121688 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. 
No retries permitted until 2026-03-12 18:20:11.121675235 +0000 UTC m=+1051.490301568 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "metrics-server-cert" not found Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.121724 4926 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 18:20:07 crc kubenswrapper[4926]: E0312 18:20:07.121743 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:11.121737327 +0000 UTC m=+1051.490363660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "webhook-server-cert" not found Mar 12 18:20:10 crc kubenswrapper[4926]: I0312 18:20:10.481460 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:10 crc kubenswrapper[4926]: E0312 18:20:10.481689 4926 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:10 crc kubenswrapper[4926]: E0312 18:20:10.481925 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert podName:fd35525c-7b73-49d1-a36c-c49d3bf933eb nodeName:}" failed. No retries permitted until 2026-03-12 18:20:18.481907359 +0000 UTC m=+1058.850533692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert") pod "infra-operator-controller-manager-5995f4446f-kzl7p" (UID: "fd35525c-7b73-49d1-a36c-c49d3bf933eb") : secret "infra-operator-webhook-server-cert" not found Mar 12 18:20:10 crc kubenswrapper[4926]: I0312 18:20:10.786403 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:10 crc kubenswrapper[4926]: E0312 18:20:10.786592 4926 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:10 crc kubenswrapper[4926]: E0312 18:20:10.786668 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert podName:5d4dea90-1696-4195-a0a0-71a3c9f3e328 nodeName:}" failed. 
No retries permitted until 2026-03-12 18:20:18.78665023 +0000 UTC m=+1059.155276563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" (UID: "5d4dea90-1696-4195-a0a0-71a3c9f3e328") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 18:20:11 crc kubenswrapper[4926]: I0312 18:20:11.190838 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:11 crc kubenswrapper[4926]: I0312 18:20:11.190894 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:11 crc kubenswrapper[4926]: E0312 18:20:11.191010 4926 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 18:20:11 crc kubenswrapper[4926]: E0312 18:20:11.191056 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:19.191043171 +0000 UTC m=+1059.559669504 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "webhook-server-cert" not found Mar 12 18:20:11 crc kubenswrapper[4926]: E0312 18:20:11.191421 4926 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 18:20:11 crc kubenswrapper[4926]: E0312 18:20:11.191474 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs podName:495cec8d-a262-4d55-9ee5-6eebb10b6765 nodeName:}" failed. No retries permitted until 2026-03-12 18:20:19.191463475 +0000 UTC m=+1059.560089818 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs") pod "openstack-operator-controller-manager-55976db558-2kgv6" (UID: "495cec8d-a262-4d55-9ee5-6eebb10b6765") : secret "metrics-server-cert" not found Mar 12 18:20:17 crc kubenswrapper[4926]: E0312 18:20:17.726573 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 12 18:20:17 crc kubenswrapper[4926]: E0312 18:20:17.727389 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w7j4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-cv97b_openstack-operators(464725b8-2734-43a4-a232-5db9bafed311): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:20:17 crc kubenswrapper[4926]: E0312 18:20:17.728578 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" 
podUID="464725b8-2734-43a4-a232-5db9bafed311" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.166753 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" podUID="464725b8-2734-43a4-a232-5db9bafed311" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.304576 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.304795 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjm58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-25jrt_openstack-operators(682f3a0f-7437-455c-99e1-8b7cdb03328a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.306104 4926 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" podUID="682f3a0f-7437-455c-99e1-8b7cdb03328a" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.503341 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.510971 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd35525c-7b73-49d1-a36c-c49d3bf933eb-cert\") pod \"infra-operator-controller-manager-5995f4446f-kzl7p\" (UID: \"fd35525c-7b73-49d1-a36c-c49d3bf933eb\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.620356 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.811348 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.825141 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d4dea90-1696-4195-a0a0-71a3c9f3e328-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk\" (UID: \"5d4dea90-1696-4195-a0a0-71a3c9f3e328\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:18 crc kubenswrapper[4926]: I0312 18:20:18.832262 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.989053 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.989222 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9sjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-7q8xr_openstack-operators(a41da562-a119-4785-95d0-eaf0970a99f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:20:18 crc kubenswrapper[4926]: E0312 18:20:18.991364 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" podUID="a41da562-a119-4785-95d0-eaf0970a99f4" Mar 12 18:20:19 crc kubenswrapper[4926]: E0312 18:20:19.171767 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" podUID="a41da562-a119-4785-95d0-eaf0970a99f4" Mar 12 18:20:19 crc kubenswrapper[4926]: E0312 18:20:19.171859 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" podUID="682f3a0f-7437-455c-99e1-8b7cdb03328a" Mar 12 18:20:19 crc kubenswrapper[4926]: I0312 18:20:19.217013 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:19 crc kubenswrapper[4926]: I0312 18:20:19.217170 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:19 crc kubenswrapper[4926]: I0312 18:20:19.220924 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-metrics-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:19 crc kubenswrapper[4926]: I0312 18:20:19.221002 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/495cec8d-a262-4d55-9ee5-6eebb10b6765-webhook-certs\") pod \"openstack-operator-controller-manager-55976db558-2kgv6\" (UID: \"495cec8d-a262-4d55-9ee5-6eebb10b6765\") " pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:19 crc kubenswrapper[4926]: I0312 18:20:19.272318 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:20 crc kubenswrapper[4926]: E0312 18:20:20.487117 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 12 18:20:20 crc kubenswrapper[4926]: E0312 18:20:20.487623 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fdhjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wqzk9_openstack-operators(fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:20:20 crc kubenswrapper[4926]: E0312 18:20:20.489686 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" podUID="fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b" Mar 12 18:20:21 crc kubenswrapper[4926]: E0312 18:20:21.061134 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 12 18:20:21 crc kubenswrapper[4926]: E0312 18:20:21.061377 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4pcf6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-5f6c4_openstack-operators(c62baaa0-d72b-4240-9dba-858bdf61d1b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:20:21 crc kubenswrapper[4926]: E0312 18:20:21.063001 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" podUID="c62baaa0-d72b-4240-9dba-858bdf61d1b3" Mar 12 18:20:21 crc kubenswrapper[4926]: E0312 18:20:21.182573 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" podUID="c62baaa0-d72b-4240-9dba-858bdf61d1b3" Mar 12 18:20:21 crc kubenswrapper[4926]: E0312 18:20:21.182778 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" podUID="fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b" Mar 12 18:20:21 crc kubenswrapper[4926]: I0312 18:20:21.956777 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk"] Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.001174 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6"] Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.048105 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p"] Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.194531 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" event={"ID":"e38cf931-bbd6-4b47-bdf6-8a514d17d3d7","Type":"ContainerStarted","Data":"828c5c775ada2fe5f95234cafd67fc4668597e16ad44e77f67724f9598d381bf"} Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.194641 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.198499 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" event={"ID":"e7b3fba0-ddaa-4cef-9df6-0683a92475cf","Type":"ContainerStarted","Data":"defe8e630055cea1ed7e76af9b7480ffc04f09076c35545b6298469cfafe3415"} Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.198650 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.200408 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" event={"ID":"065fe73a-651c-4cd3-b8d7-135617c51bbd","Type":"ContainerStarted","Data":"1522e86e37c89182735cd000418f29a0cbe10a5e5e870d136dd2747a3dc542ef"} Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.200493 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.202503 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" event={"ID":"e3baa344-9dd8-48ed-8b6a-60ff9fbc181a","Type":"ContainerStarted","Data":"6cb7e85519441ebc424d3fc21a8e17cb17c9bb6c61b7f2a163ccc016db5dbf47"} Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.202627 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.213504 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7" podStartSLOduration=4.004369284 podStartE2EDuration="20.213484733s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.827990784 +0000 UTC m=+1045.196617117" 
lastFinishedPulling="2026-03-12 18:20:21.037106233 +0000 UTC m=+1061.405732566" observedRunningTime="2026-03-12 18:20:22.213314727 +0000 UTC m=+1062.581941060" watchObservedRunningTime="2026-03-12 18:20:22.213484733 +0000 UTC m=+1062.582111066" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.227792 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7" podStartSLOduration=3.035423707 podStartE2EDuration="20.227779087s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:03.84433033 +0000 UTC m=+1044.212956663" lastFinishedPulling="2026-03-12 18:20:21.03668571 +0000 UTC m=+1061.405312043" observedRunningTime="2026-03-12 18:20:22.226942541 +0000 UTC m=+1062.595568874" watchObservedRunningTime="2026-03-12 18:20:22.227779087 +0000 UTC m=+1062.596405420" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.246580 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx" podStartSLOduration=3.537968062 podStartE2EDuration="20.246561591s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.329423282 +0000 UTC m=+1044.698049615" lastFinishedPulling="2026-03-12 18:20:21.038016821 +0000 UTC m=+1061.406643144" observedRunningTime="2026-03-12 18:20:22.242988691 +0000 UTC m=+1062.611615024" watchObservedRunningTime="2026-03-12 18:20:22.246561591 +0000 UTC m=+1062.615187924" Mar 12 18:20:22 crc kubenswrapper[4926]: I0312 18:20:22.257952 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt" podStartSLOduration=3.563995602 podStartE2EDuration="20.257930105s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.344169091 +0000 UTC m=+1044.712795434" lastFinishedPulling="2026-03-12 18:20:21.038103604 +0000 UTC m=+1061.406729937" observedRunningTime="2026-03-12 18:20:22.256810071 +0000 UTC m=+1062.625436404" watchObservedRunningTime="2026-03-12 18:20:22.257930105 +0000 UTC m=+1062.626556448" Mar 12 18:20:22 crc kubenswrapper[4926]: W0312 18:20:22.759210 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4dea90_1696_4195_a0a0_71a3c9f3e328.slice/crio-88b9fb6167625e93eee57ccc328dfab0817f506629002b33f245a06fcba6ce90 WatchSource:0}: Error finding container 88b9fb6167625e93eee57ccc328dfab0817f506629002b33f245a06fcba6ce90: Status 404 returned error can't find the container with id 88b9fb6167625e93eee57ccc328dfab0817f506629002b33f245a06fcba6ce90 Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.241751 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" event={"ID":"309d4a2a-f738-4d2d-a28e-f361f762f997","Type":"ContainerStarted","Data":"763feb0ffeb642cdca0bbd08795356fb1d89c52e6e0d42195168772ebf5643ff"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.242271 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.246662 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" 
event={"ID":"14346445-95be-488f-858c-44bf5b45c656","Type":"ContainerStarted","Data":"1e1e50fa2f078b7411a21bc28789c426d7e98f9c928fcaed0397b34bc5590e1a"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.247282 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.249737 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" event={"ID":"2e44c177-b87d-4ff6-80ca-672477fe9e94","Type":"ContainerStarted","Data":"06660c125bb692846ae880a8122d118f266ae4c6008193fb85fbd988d5404f66"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.250144 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.262864 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk" podStartSLOduration=3.465089555 podStartE2EDuration="21.262822539s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.042964092 +0000 UTC m=+1045.411590425" lastFinishedPulling="2026-03-12 18:20:22.840697076 +0000 UTC m=+1063.209323409" observedRunningTime="2026-03-12 18:20:23.262688525 +0000 UTC m=+1063.631314858" watchObservedRunningTime="2026-03-12 18:20:23.262822539 +0000 UTC m=+1063.631448872" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.263849 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" event={"ID":"45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd","Type":"ContainerStarted","Data":"c0443003c4743aa7a8eadd5f18ce00657edab23715d985580dfd8099373775a9"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.264493 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.267425 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" event={"ID":"f112cb87-7454-41fa-a1e1-381d79f86247","Type":"ContainerStarted","Data":"72543ac4a663a52b3e29b7d8ba5c55081c5c3f12c96fb35dc79e48ecb592054b"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.267559 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.284687 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" event={"ID":"fea71415-42ac-4e77-ba9c-25170ccece27","Type":"ContainerStarted","Data":"a48c53a7579a238f9cb007586cb237c48f90369196db63fd60dbc80fe597c890"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.285299 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.306188 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" 
event={"ID":"4800992f-cfad-4a1c-94e5-79427f88c002","Type":"ContainerStarted","Data":"c2e7fdb0fc833c08fda5b2daeee9daeb435751ae04920ae6c3e37e2b25bf03ee"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.306229 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.310671 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" event={"ID":"ccd465e8-6811-4865-9602-3dea8144cc01","Type":"ContainerStarted","Data":"6153011a4207e693ecf652341f8cbc71a87f22cba9fde5b2f3412942f4009725"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.311350 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.314815 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" event={"ID":"fd35525c-7b73-49d1-a36c-c49d3bf933eb","Type":"ContainerStarted","Data":"7aeb0adefead3376b85330b09dcae3d11e44ab37d90aa71590a5de68302d9924"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.319190 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w" podStartSLOduration=4.615689543 podStartE2EDuration="21.319176723s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.044068117 +0000 UTC m=+1045.412694450" lastFinishedPulling="2026-03-12 18:20:21.747555297 +0000 UTC m=+1062.116181630" observedRunningTime="2026-03-12 18:20:23.314822578 +0000 UTC m=+1063.683448921" watchObservedRunningTime="2026-03-12 18:20:23.319176723 +0000 UTC m=+1063.687803056" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.319964 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" podStartSLOduration=5.086988966 podStartE2EDuration="21.319957037s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.81242374 +0000 UTC m=+1045.181050073" lastFinishedPulling="2026-03-12 18:20:21.045391811 +0000 UTC m=+1061.414018144" observedRunningTime="2026-03-12 18:20:23.293290417 +0000 UTC m=+1063.661916750" watchObservedRunningTime="2026-03-12 18:20:23.319957037 +0000 UTC m=+1063.688583370" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.322745 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" event={"ID":"495cec8d-a262-4d55-9ee5-6eebb10b6765","Type":"ContainerStarted","Data":"685059c69032e95006cf858050966322f44a4eb6be7907d55e6e91f8dc07ee66"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.322782 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" event={"ID":"495cec8d-a262-4d55-9ee5-6eebb10b6765","Type":"ContainerStarted","Data":"8aa1ab094661eae0b82ab1807123f94c2047ccfbc663b32ef8677c123b3e8cfe"} Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.323499 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 
18:20:23.329152 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" event={"ID":"b776be98-1352-43c6-8ee8-e31076b7d12b","Type":"ContainerStarted","Data":"db53c8de65025967409dd19d5adc0986b9c49d06c657ab39c391e6ced0dff218"}
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.329791 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.335848 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd" podStartSLOduration=4.248416426 podStartE2EDuration="21.335835391s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:03.950133532 +0000 UTC m=+1044.318759865" lastFinishedPulling="2026-03-12 18:20:21.037552497 +0000 UTC m=+1061.406178830" observedRunningTime="2026-03-12 18:20:23.335243292 +0000 UTC m=+1063.703869635" watchObservedRunningTime="2026-03-12 18:20:23.335835391 +0000 UTC m=+1063.704461724"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.342053 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" event={"ID":"5d4dea90-1696-4195-a0a0-71a3c9f3e328","Type":"ContainerStarted","Data":"88b9fb6167625e93eee57ccc328dfab0817f506629002b33f245a06fcba6ce90"}
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.357772 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5" podStartSLOduration=4.701462042 podStartE2EDuration="21.357755333s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.38140738 +0000 UTC m=+1044.750033713" lastFinishedPulling="2026-03-12 18:20:21.037700671 +0000 UTC m=+1061.406327004" observedRunningTime="2026-03-12 18:20:23.354696708 +0000 UTC m=+1063.723323041" watchObservedRunningTime="2026-03-12 18:20:23.357755333 +0000 UTC m=+1063.726381676"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.415764 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" podStartSLOduration=5.3691268149999996 podStartE2EDuration="21.415746978s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.005016322 +0000 UTC m=+1045.373642665" lastFinishedPulling="2026-03-12 18:20:21.051636495 +0000 UTC m=+1061.420262828" observedRunningTime="2026-03-12 18:20:23.387379225 +0000 UTC m=+1063.756005558" watchObservedRunningTime="2026-03-12 18:20:23.415746978 +0000 UTC m=+1063.784373311"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.444292 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh" podStartSLOduration=5.444544481 podStartE2EDuration="21.444274955s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.059238169 +0000 UTC m=+1045.427864492" lastFinishedPulling="2026-03-12 18:20:21.058968633 +0000 UTC m=+1061.427594966" observedRunningTime="2026-03-12 18:20:23.417080109 +0000 UTC m=+1063.785706442" watchObservedRunningTime="2026-03-12 18:20:23.444274955 +0000 UTC m=+1063.812901288"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.448349 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf" podStartSLOduration=4.261933277 podStartE2EDuration="21.448337921s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:03.85043351 +0000 UTC m=+1044.219059843" lastFinishedPulling="2026-03-12 18:20:21.036838154 +0000 UTC m=+1061.405464487" observedRunningTime="2026-03-12 18:20:23.447728132 +0000 UTC m=+1063.816354465" watchObservedRunningTime="2026-03-12 18:20:23.448337921 +0000 UTC m=+1063.816964254"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.564087 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh" podStartSLOduration=5.311507463 podStartE2EDuration="21.564070333s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.812800412 +0000 UTC m=+1045.181426745" lastFinishedPulling="2026-03-12 18:20:21.065363282 +0000 UTC m=+1061.433989615" observedRunningTime="2026-03-12 18:20:23.507939066 +0000 UTC m=+1063.876565399" watchObservedRunningTime="2026-03-12 18:20:23.564070333 +0000 UTC m=+1063.932696666"
Mar 12 18:20:23 crc kubenswrapper[4926]: I0312 18:20:23.565546 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6" podStartSLOduration=20.565542408 podStartE2EDuration="20.565542408s" podCreationTimestamp="2026-03-12 18:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:20:23.56111466 +0000 UTC m=+1063.929740993" watchObservedRunningTime="2026-03-12 18:20:23.565542408 +0000 UTC m=+1063.934168731"
Mar 12 18:20:26 crc kubenswrapper[4926]: I0312 18:20:26.817991 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:20:26 crc kubenswrapper[4926]: I0312 18:20:26.818572 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.412899 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" event={"ID":"fd35525c-7b73-49d1-a36c-c49d3bf933eb","Type":"ContainerStarted","Data":"6b651cdbce774b8a347381e80f13dd9f73cc82aaa5c2ba84ea09ce1f53cf5d0b"}
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.413185 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.414855 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" event={"ID":"6374b396-00ef-4aca-ac07-fd46982f23f1","Type":"ContainerStarted","Data":"944c57984d00334e897efd2e8fe4538356b0be1c8b8ec63402fd0cf921161440"}
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.415112 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.416217 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" event={"ID":"5d4dea90-1696-4195-a0a0-71a3c9f3e328","Type":"ContainerStarted","Data":"ac118a798d9939b08069f9d91a941181a79f28ac4bcd090bdaefb227e03d138c"}
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.416281 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.417709 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" event={"ID":"80f26a3f-ab1e-49b1-8843-6674d948a5cd","Type":"ContainerStarted","Data":"61ae93568ede1eb4e0b871691c777333b050ef442c1812ff765143573d2c5149"}
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.417874 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.436651 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" podStartSLOduration=21.072744135 podStartE2EDuration="26.436629897s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:22.80933297 +0000 UTC m=+1063.177959313" lastFinishedPulling="2026-03-12 18:20:28.173218732 +0000 UTC m=+1068.541845075" observedRunningTime="2026-03-12 18:20:28.429280169 +0000 UTC m=+1068.797906542" watchObservedRunningTime="2026-03-12 18:20:28.436629897 +0000 UTC m=+1068.805256230"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.460128 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" podStartSLOduration=21.088855876 podStartE2EDuration="26.460110108s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:22.761599095 +0000 UTC m=+1063.130225428" lastFinishedPulling="2026-03-12 18:20:28.132853317 +0000 UTC m=+1068.501479660" observedRunningTime="2026-03-12 18:20:28.451925484 +0000 UTC m=+1068.820551817" watchObservedRunningTime="2026-03-12 18:20:28.460110108 +0000 UTC m=+1068.828736441"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.475293 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb" podStartSLOduration=3.376682504 podStartE2EDuration="26.475275299s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.044307414 +0000 UTC m=+1045.412933747" lastFinishedPulling="2026-03-12 18:20:28.142900199 +0000 UTC m=+1068.511526542" observedRunningTime="2026-03-12 18:20:28.471287126 +0000 UTC m=+1068.839913459" watchObservedRunningTime="2026-03-12 18:20:28.475275299 +0000 UTC m=+1068.843901642"
Mar 12 18:20:28 crc kubenswrapper[4926]: I0312 18:20:28.488307 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" podStartSLOduration=2.371981368 podStartE2EDuration="25.488290435s" podCreationTimestamp="2026-03-12 18:20:03 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.019274805 +0000 UTC m=+1045.387901138" lastFinishedPulling="2026-03-12 18:20:28.135583842 +0000 UTC m=+1068.504210205" observedRunningTime="2026-03-12 18:20:28.486817669 +0000 UTC m=+1068.855444002" watchObservedRunningTime="2026-03-12 18:20:28.488290435 +0000 UTC m=+1068.856916768"
Mar 12 18:20:29 crc kubenswrapper[4926]: I0312 18:20:29.285776 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55976db558-2kgv6"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.768508 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-ssmx7"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.788977 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zlsrd"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.820411 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-clngf"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.883275 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-m6sxh"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.886060 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-525z5"
Mar 12 18:20:32 crc kubenswrapper[4926]: I0312 18:20:32.915001 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-p76mx"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.161815 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-f96w7"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.216356 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-tpnlt"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.236636 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-c58lh"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.262118 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-r2cxk"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.353777 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-njrjb"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.368664 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-dzj5w"
Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.415767 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747"
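The probe transitions above and the machine-config-daemon liveness failures at 18:20:26 are both produced by the kubelet's prober polling a container's HTTP endpoint. A minimal Python sketch of what such a check amounts to, using the endpoint from the failure output (http://127.0.0.1:8798/health); the 1-second timeout and the "any status below 400 is success" criterion are assumptions for illustration, not values taken from this log:

import urllib.request
import urllib.error

def http_probe(url: str, timeout: float = 1.0) -> str:
    """Approximate a single HTTP liveness/readiness probe attempt."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Assumption for this sketch: any non-error status counts as success.
            return "success" if resp.status < 400 else "failure"
    except (urllib.error.URLError, OSError) as exc:
        # While the daemon is down this raises the same "connection refused"
        # seen in the probe output logged above.
        return f"failure: {exc}"

print(http_probe("http://127.0.0.1:8798/health"))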
pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xr747" Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.579562 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-z87mn" Mar 12 18:20:33 crc kubenswrapper[4926]: I0312 18:20:33.659754 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cn7fq" Mar 12 18:20:38 crc kubenswrapper[4926]: I0312 18:20:38.628910 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-kzl7p" Mar 12 18:20:38 crc kubenswrapper[4926]: I0312 18:20:38.840526 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.544148 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" event={"ID":"682f3a0f-7437-455c-99e1-8b7cdb03328a","Type":"ContainerStarted","Data":"e6e51b327fb4785b9196fc01b7c7230b6d836342a4107acead6dbb71e7f13361"} Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.544896 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" event={"ID":"fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b","Type":"ContainerStarted","Data":"50c65420b38ce4755cb1a02016ba35111df1c92ad7f069eeba953c8cb113d774"} Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.545191 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.546416 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" event={"ID":"c62baaa0-d72b-4240-9dba-858bdf61d1b3","Type":"ContainerStarted","Data":"7a1b9c63ce02ac15ed7d321e38b223df41144d7b29531cda327c0d6a8edc5ea7"} Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.546687 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.547470 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" event={"ID":"464725b8-2734-43a4-a232-5db9bafed311","Type":"ContainerStarted","Data":"9dad9f033bfa07f0de5d8fdcf103a401f35d903a42ba20049335b13790cf0a34"} Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.547796 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.549049 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" event={"ID":"a41da562-a119-4785-95d0-eaf0970a99f4","Type":"ContainerStarted","Data":"8214da748fc5915f21ee97850d91b201196f611f9ddc3bb6fb8cb58347eeabf4"} Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.549515 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:42 crc 
kubenswrapper[4926]: I0312 18:20:42.574837 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5f6c4" podStartSLOduration=4.020561858 podStartE2EDuration="40.574818626s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.770659461 +0000 UTC m=+1045.139285794" lastFinishedPulling="2026-03-12 18:20:41.324916229 +0000 UTC m=+1081.693542562" observedRunningTime="2026-03-12 18:20:42.573286969 +0000 UTC m=+1082.941913312" watchObservedRunningTime="2026-03-12 18:20:42.574818626 +0000 UTC m=+1082.943444969" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.592787 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt" podStartSLOduration=4.28515098 podStartE2EDuration="40.592767485s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.017716606 +0000 UTC m=+1045.386342939" lastFinishedPulling="2026-03-12 18:20:41.325333071 +0000 UTC m=+1081.693959444" observedRunningTime="2026-03-12 18:20:42.560770449 +0000 UTC m=+1082.929396782" watchObservedRunningTime="2026-03-12 18:20:42.592767485 +0000 UTC m=+1082.961393818" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.606899 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wqzk9" podStartSLOduration=3.299419974 podStartE2EDuration="39.606881304s" podCreationTimestamp="2026-03-12 18:20:03 +0000 UTC" firstStartedPulling="2026-03-12 18:20:05.018221902 +0000 UTC m=+1045.386848235" lastFinishedPulling="2026-03-12 18:20:41.325683212 +0000 UTC m=+1081.694309565" observedRunningTime="2026-03-12 18:20:42.601817205 +0000 UTC m=+1082.970443538" watchObservedRunningTime="2026-03-12 18:20:42.606881304 +0000 UTC m=+1082.975507637" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.622615 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b" podStartSLOduration=4.128823964 podStartE2EDuration="40.622598732s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.829978685 +0000 UTC m=+1045.198605018" lastFinishedPulling="2026-03-12 18:20:41.323753423 +0000 UTC m=+1081.692379786" observedRunningTime="2026-03-12 18:20:42.618836055 +0000 UTC m=+1082.987462388" watchObservedRunningTime="2026-03-12 18:20:42.622598732 +0000 UTC m=+1082.991225065" Mar 12 18:20:42 crc kubenswrapper[4926]: I0312 18:20:42.640534 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" podStartSLOduration=4.148124116 podStartE2EDuration="40.64051137s" podCreationTimestamp="2026-03-12 18:20:02 +0000 UTC" firstStartedPulling="2026-03-12 18:20:04.831493263 +0000 UTC m=+1045.200119596" lastFinishedPulling="2026-03-12 18:20:41.323880517 +0000 UTC m=+1081.692506850" observedRunningTime="2026-03-12 18:20:42.635580326 +0000 UTC m=+1083.004206659" watchObservedRunningTime="2026-03-12 18:20:42.64051137 +0000 UTC m=+1083.009137723" Mar 12 18:20:53 crc kubenswrapper[4926]: I0312 18:20:53.052102 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-7q8xr" Mar 12 18:20:53 crc kubenswrapper[4926]: I0312 18:20:53.159294 
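The "Observed pod startup duration" records above carry enough fields to re-derive both durations: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch in Python that checks this against the keystone-operator entry; the values are copied from the log, while the formula is inferred from the fields rather than quoted from kubelet source, so treat it as an assumption:

from datetime import datetime, timezone

def ts(s: str) -> datetime:
    """Parse a nanosecond timestamp like '2026-03-12 18:20:41.324916229'.

    Python datetimes hold microseconds, so the last three digits are dropped.
    """
    date, frac = s.rsplit(".", 1)
    return datetime.strptime(f"{date}.{frac[:6]}",
                             "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

# Fields copied from the keystone-operator-controller-manager record above.
created      = ts("2026-03-12 18:20:02.000000000")   # podCreationTimestamp
first_pull   = ts("2026-03-12 18:20:04.770659461")   # firstStartedPulling
last_pull    = ts("2026-03-12 18:20:41.324916229")   # lastFinishedPulling
observed_run = ts("2026-03-12 18:20:42.574818626")   # watchObservedRunningTime

e2e = (observed_run - created).total_seconds()
slo = e2e - (last_pull - first_pull).total_seconds()
# Prints E2E=40.574818s SLO=4.020561s, matching the logged 40.574818626s and
# 4.020561858 up to the nanoseconds dropped during parsing.
print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")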
Mar 12 18:20:53 crc kubenswrapper[4926]: I0312 18:20:53.362325 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-cv97b"
Mar 12 18:20:53 crc kubenswrapper[4926]: I0312 18:20:53.647465 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-25jrt"
Mar 12 18:20:56 crc kubenswrapper[4926]: I0312 18:20:56.817958 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:20:56 crc kubenswrapper[4926]: I0312 18:20:56.818332 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:21:01 crc kubenswrapper[4926]: I0312 18:21:01.361183 4926 scope.go:117] "RemoveContainer" containerID="5d29d1a82fd23f75a9296b7891f008517b3e7945af33e9449e2dfcb52711dd8e"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.977162 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"]
Mar 12 18:21:12 crc kubenswrapper[4926]: E0312 18:21:12.977865 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50fb579-57d6-4029-a4f3-c8a3303bac4d" containerName="oc"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.977877 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50fb579-57d6-4029-a4f3-c8a3303bac4d" containerName="oc"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.978023 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50fb579-57d6-4029-a4f3-c8a3303bac4d" containerName="oc"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.978698 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.982700 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.982709 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.982700 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.983060 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-ggplz"
Mar 12 18:21:12 crc kubenswrapper[4926]: I0312 18:21:12.992662 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"]
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.020673 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"]
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.022728 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.032055 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.040992 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"]
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.088062 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.088342 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwd8\" (UniqueName: \"kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.189225 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94ql\" (UniqueName: \"kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.189285 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.189334 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.189363 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.189395 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwd8\" (UniqueName: \"kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.190322 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.208698 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwd8\" (UniqueName: \"kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8\") pod \"dnsmasq-dns-675f4bcbfc-9hlv4\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.290866 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.290948 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.290992 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94ql\" (UniqueName: \"kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.291919 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.292784 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.296564 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.314192 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94ql\" (UniqueName: \"kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql\") pod \"dnsmasq-dns-78dd6ddcc-2x544\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.353748 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544"
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.554693 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"]
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.634034 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"]
Mar 12 18:21:13 crc kubenswrapper[4926]: W0312 18:21:13.635620 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757e1395_7d78_496a_bd4e_eb7a13b90d10.slice/crio-8ced1e2cf1a0fabd93eb5ebb045e1e613152c80826e96ac3f9497d2c59523b9b WatchSource:0}: Error finding container 8ced1e2cf1a0fabd93eb5ebb045e1e613152c80826e96ac3f9497d2c59523b9b: Status 404 returned error can't find the container with id 8ced1e2cf1a0fabd93eb5ebb045e1e613152c80826e96ac3f9497d2c59523b9b
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.848076 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4" event={"ID":"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f","Type":"ContainerStarted","Data":"1cba9386f004d3a4567f81b3b21065a63a088e2f0cf17a6228bba62f9f3a3e10"}
Mar 12 18:21:13 crc kubenswrapper[4926]: I0312 18:21:13.849361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544" event={"ID":"757e1395-7d78-496a-bd4e-eb7a13b90d10","Type":"ContainerStarted","Data":"8ced1e2cf1a0fabd93eb5ebb045e1e613152c80826e96ac3f9497d2c59523b9b"}
Mar 12 18:21:15 crc kubenswrapper[4926]: I0312 18:21:15.961046 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.003026 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.004122 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.037980 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.134492 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwd9r\" (UniqueName: \"kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.134588 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.134664 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.236130 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.236521 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.236626 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwd9r\" (UniqueName: \"kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.237332 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.237676 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.259002 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwd9r\" (UniqueName: \"kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r\") pod \"dnsmasq-dns-5ccc8479f9-px8d4\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.275933 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.304865 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.306318 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.320669 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.323019 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.440284 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jt2\" (UniqueName: \"kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.440359 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.440395 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.541555 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jt2\" (UniqueName: \"kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.541911 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.541960 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.542846 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.543128 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.560802 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jt2\" (UniqueName: \"kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2\") pod \"dnsmasq-dns-57d769cc4f-lk7rj\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.615533 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"]
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.634739 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj"
Mar 12 18:21:16 crc kubenswrapper[4926]: W0312 18:21:16.640593 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e12f792_0fbe_4996_a6ec_272e5108dd33.slice/crio-9f0b5cf18f405df691a53e33e3f957de2dcab02a1304d75c9067e908e6be9be3 WatchSource:0}: Error finding container 9f0b5cf18f405df691a53e33e3f957de2dcab02a1304d75c9067e908e6be9be3: Status 404 returned error can't find the container with id 9f0b5cf18f405df691a53e33e3f957de2dcab02a1304d75c9067e908e6be9be3
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.876027 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" event={"ID":"6e12f792-0fbe-4996-a6ec-272e5108dd33","Type":"ContainerStarted","Data":"9f0b5cf18f405df691a53e33e3f957de2dcab02a1304d75c9067e908e6be9be3"}
Mar 12 18:21:16 crc kubenswrapper[4926]: I0312 18:21:16.916091 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"]
Mar 12 18:21:16 crc kubenswrapper[4926]: W0312 18:21:16.927780 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5241fdef_63ed_416f_9bf0_c004597cc099.slice/crio-22b5f71bb0db465dea314a1f0530175669b6d30c754c5e44cf8c2d8ddfdcad3d WatchSource:0}: Error finding container 22b5f71bb0db465dea314a1f0530175669b6d30c754c5e44cf8c2d8ddfdcad3d: Status 404 returned error can't find the container with id 22b5f71bb0db465dea314a1f0530175669b6d30c754c5e44cf8c2d8ddfdcad3d
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.146214 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.147814 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
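Every volume in the reconciler records above moves through the same sequence: operationExecutor.VerifyControllerAttachedVolume, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded. A hypothetical Python helper over lines in this journal format can pair the start and success records per (pod, volume) to estimate setup latency; the regexes are derived from the record layout above, and the whole helper is an illustration, not existing kubelet tooling:

import re
from datetime import datetime

# Record layout, per the lines above: klog level+date (e.g. I0312), timestamp,
# quoted message with the escaped volume name, trailing pod="namespace/name".
STARTED = re.compile(
    r'[IWE]\d{4} (\d{2}:\d{2}:\d{2}\.\d+) .*operationExecutor\.MountVolume started '
    r'for volume \\"([^"\\]+)\\".* pod="([^"]+)"')
SUCCEEDED = re.compile(
    r'[IWE]\d{4} (\d{2}:\d{2}:\d{2}\.\d+) .*MountVolume\.SetUp succeeded '
    r'for volume \\"([^"\\]+)\\".* pod="([^"]+)"')

def parse(t: str) -> datetime:
    # Timestamps here carry microseconds; same-day arithmetic only.
    return datetime.strptime(t, "%H:%M:%S.%f")

def mount_latencies(lines):
    """Yield (pod, volume, seconds) for each started/succeeded pair."""
    pending = {}
    for line in lines:
        if m := STARTED.search(line):
            pending[(m.group(3), m.group(2))] = parse(m.group(1))
        elif (m := SUCCEEDED.search(line)) and (m.group(3), m.group(2)) in pending:
            start = pending.pop((m.group(3), m.group(2)))
            yield m.group(3), m.group(2), (parse(m.group(1)) - start).total_seconds()

Fed the dnsmasq-dns-78dd6ddcc-2x544 records above, this would report the config volume taking roughly 2 ms (MountVolume started at 18:21:13.290866, SetUp succeeded at 18:21:13.292784).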
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.159978 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.160232 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.160405 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.160581 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.160722 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.160872 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.161074 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r4knd"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.166556 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250210 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250254 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzhl\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250283 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250304 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250332 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250487 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250562 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250613 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250686 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250759 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.250793 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.351829 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.352100 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.352127 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.352219 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353316 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353390 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzhl\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353424 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353459 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353485 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353513 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353539 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353562 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.353991 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.354349 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.355882 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.356458 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.356997 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.358307 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.362316 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.362865 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.365302 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.383162 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzhl\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.393842 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.434737 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.438680 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.445283 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.445920 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.446597 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.446642 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.446797 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k928p"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.446960 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.447266 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.447941 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.503431 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556316 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556370 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7kq2\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556417 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556452 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556472 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556668 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556714 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556781 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556851 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556883 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.556932 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658741 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658789 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kq2\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658824 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658847 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658868 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658902 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658922 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658954 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.658982 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.659003 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.659025 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.659337 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.659595 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.660367 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.659334 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.666252 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.668952 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.669456 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.670673 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.672105 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.676041 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7kq2\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.684992 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.685104 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.769165 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 18:21:17 crc kubenswrapper[4926]: I0312 18:21:17.883421 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" event={"ID":"5241fdef-63ed-416f-9bf0-c004597cc099","Type":"ContainerStarted","Data":"22b5f71bb0db465dea314a1f0530175669b6d30c754c5e44cf8c2d8ddfdcad3d"}
Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.327475 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.329233 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.335700 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9fnd5" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.341629 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.342320 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.342737 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.348850 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.354299 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377773 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377818 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377845 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377891 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-default\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377908 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377929 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v6wb\" (UniqueName: \"kubernetes.io/projected/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kube-api-access-5v6wb\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.377990 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.378020 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.479316 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.479489 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.480700 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486576 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486738 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486782 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486823 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486861 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486881 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.486903 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v6wb\" (UniqueName: \"kubernetes.io/projected/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kube-api-access-5v6wb\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.488841 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kolla-config\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.492836 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.493193 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.494075 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-config-data-default\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.501784 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.512012 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v6wb\" (UniqueName: \"kubernetes.io/projected/1ee00086-3c8a-4f3a-a5d5-9590715a8b95-kube-api-access-5v6wb\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.520975 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"1ee00086-3c8a-4f3a-a5d5-9590715a8b95\") " pod="openstack/openstack-galera-0" Mar 12 18:21:18 crc kubenswrapper[4926]: I0312 18:21:18.653087 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.725561 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.726934 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.730462 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.730730 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.731486 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.731655 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fdjl6" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.741396 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804086 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804135 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804377 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804423 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804484 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804543 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.804596 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxz7\" (UniqueName: \"kubernetes.io/projected/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kube-api-access-llxz7\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.905980 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906036 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906106 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906144 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906169 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906245 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906715 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.906961 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.907039 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.907096 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxz7\" (UniqueName: \"kubernetes.io/projected/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kube-api-access-llxz7\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.907818 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.908097 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.908519 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94d38c16-a6c9-44ed-a49e-398dc34b92ce-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.910913 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.912153 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d38c16-a6c9-44ed-a49e-398dc34b92ce-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc kubenswrapper[4926]: I0312 18:21:19.927482 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxz7\" (UniqueName: \"kubernetes.io/projected/94d38c16-a6c9-44ed-a49e-398dc34b92ce-kube-api-access-llxz7\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:19 crc 
kubenswrapper[4926]: I0312 18:21:19.933551 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"94d38c16-a6c9-44ed-a49e-398dc34b92ce\") " pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.048850 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.235282 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.236190 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.237837 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g9zp2" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.244920 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.247228 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.247497 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.319109 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.319274 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-kolla-config\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.319326 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-config-data\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.319354 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27297\" (UniqueName: \"kubernetes.io/projected/69083379-a7d7-4876-9955-497420eab579-kube-api-access-27297\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.319506 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.421954 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27297\" (UniqueName: 
\"kubernetes.io/projected/69083379-a7d7-4876-9955-497420eab579-kube-api-access-27297\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.422017 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.422073 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.422125 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-kolla-config\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.422148 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-config-data\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.422861 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-config-data\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.424190 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69083379-a7d7-4876-9955-497420eab579-kolla-config\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.426418 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-memcached-tls-certs\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.439420 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27297\" (UniqueName: \"kubernetes.io/projected/69083379-a7d7-4876-9955-497420eab579-kube-api-access-27297\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.443159 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69083379-a7d7-4876-9955-497420eab579-combined-ca-bundle\") pod \"memcached-0\" (UID: \"69083379-a7d7-4876-9955-497420eab579\") " pod="openstack/memcached-0" Mar 12 18:21:20 crc kubenswrapper[4926]: I0312 18:21:20.551269 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.368911 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.370090 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.373736 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5578n" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.383480 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.454950 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjbq\" (UniqueName: \"kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq\") pod \"kube-state-metrics-0\" (UID: \"3343d19e-07d3-4de8-954a-f7e31aa8279f\") " pod="openstack/kube-state-metrics-0" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.556659 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjbq\" (UniqueName: \"kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq\") pod \"kube-state-metrics-0\" (UID: \"3343d19e-07d3-4de8-954a-f7e31aa8279f\") " pod="openstack/kube-state-metrics-0" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.577202 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjbq\" (UniqueName: \"kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq\") pod \"kube-state-metrics-0\" (UID: \"3343d19e-07d3-4de8-954a-f7e31aa8279f\") " pod="openstack/kube-state-metrics-0" Mar 12 18:21:22 crc kubenswrapper[4926]: I0312 18:21:22.701181 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.838582 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sfwpr"] Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.839983 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.853504 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr"] Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.855490 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.855858 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-64822" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.856025 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.880574 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-6znbv"] Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.882576 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.900954 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6znbv"] Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.926616 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.926675 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-etc-ovs\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.926710 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run-ovn\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.926741 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-lib\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.926962 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-log\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927023 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-ovn-controller-tls-certs\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927054 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-log-ovn\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927098 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30dfa384-92a5-49cf-9793-60478855264f-scripts\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927121 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-run\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927137 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f19fbb7-ea3b-437a-a634-498e6a593ef6-scripts\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927165 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56ctw\" (UniqueName: \"kubernetes.io/projected/8f19fbb7-ea3b-437a-a634-498e6a593ef6-kube-api-access-56ctw\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927250 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpdpt\" (UniqueName: \"kubernetes.io/projected/30dfa384-92a5-49cf-9793-60478855264f-kube-api-access-tpdpt\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.927273 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-combined-ca-bundle\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.946648 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.948133 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.954747 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.955142 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-s27qn" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.955284 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.955418 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.955497 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 12 18:21:25 crc kubenswrapper[4926]: I0312 18:21:25.960632 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.028162 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-ovn-controller-tls-certs\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029255 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-log-ovn\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029286 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029306 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029339 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30dfa384-92a5-49cf-9793-60478855264f-scripts\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029360 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f19fbb7-ea3b-437a-a634-498e6a593ef6-scripts\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029383 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-run\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029493 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56ctw\" (UniqueName: \"kubernetes.io/projected/8f19fbb7-ea3b-437a-a634-498e6a593ef6-kube-api-access-56ctw\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029533 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029587 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpdpt\" (UniqueName: \"kubernetes.io/projected/30dfa384-92a5-49cf-9793-60478855264f-kube-api-access-tpdpt\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029608 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-combined-ca-bundle\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029652 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029698 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029763 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029816 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-etc-ovs\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029853 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run-ovn\") pod \"ovn-controller-sfwpr\" (UID: 
\"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029904 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-lib\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029977 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.029987 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-log-ovn\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030010 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrgc\" (UniqueName: \"kubernetes.io/projected/c3288572-a9dc-4f96-8535-b05f6f22855b-kube-api-access-chrgc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030068 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030100 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-log\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030063 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-run\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030314 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-log\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030396 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run-ovn\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030450 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/8f19fbb7-ea3b-437a-a634-498e6a593ef6-var-run\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030573 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-etc-ovs\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.030651 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/30dfa384-92a5-49cf-9793-60478855264f-var-lib\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.031312 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30dfa384-92a5-49cf-9793-60478855264f-scripts\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.031482 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f19fbb7-ea3b-437a-a634-498e6a593ef6-scripts\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.033764 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-ovn-controller-tls-certs\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.034424 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f19fbb7-ea3b-437a-a634-498e6a593ef6-combined-ca-bundle\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.048094 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56ctw\" (UniqueName: \"kubernetes.io/projected/8f19fbb7-ea3b-437a-a634-498e6a593ef6-kube-api-access-56ctw\") pod \"ovn-controller-sfwpr\" (UID: \"8f19fbb7-ea3b-437a-a634-498e6a593ef6\") " pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.048415 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpdpt\" (UniqueName: \"kubernetes.io/projected/30dfa384-92a5-49cf-9793-60478855264f-kube-api-access-tpdpt\") pod \"ovn-controller-ovs-6znbv\" (UID: \"30dfa384-92a5-49cf-9793-60478855264f\") " pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131365 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131411 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chrgc\" (UniqueName: \"kubernetes.io/projected/c3288572-a9dc-4f96-8535-b05f6f22855b-kube-api-access-chrgc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131430 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131509 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131528 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131571 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131596 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131617 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.131796 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.132477 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.132671 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.132684 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3288572-a9dc-4f96-8535-b05f6f22855b-config\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.135728 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.145744 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.146059 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3288572-a9dc-4f96-8535-b05f6f22855b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.148045 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrgc\" (UniqueName: \"kubernetes.io/projected/c3288572-a9dc-4f96-8535-b05f6f22855b-kube-api-access-chrgc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.161116 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c3288572-a9dc-4f96-8535-b05f6f22855b\") " pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.178566 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.207971 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.273637 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.818087 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.818168 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.818237 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.819151 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.819306 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb" gracePeriod=600 Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.965768 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb" exitCode=0 Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.965845 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb"} Mar 12 18:21:26 crc kubenswrapper[4926]: I0312 18:21:26.966034 4926 scope.go:117] "RemoveContainer" containerID="a397bef079b1410b3294983dad25ada9109b1a0eac364c78c0ff4aeeccdf38ed" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.649566 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.650911 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.653386 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-hl59j" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.657182 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.657304 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.657360 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.674589 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685072 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685167 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685196 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685259 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdv8t\" (UniqueName: \"kubernetes.io/projected/8926ccac-b553-4e37-bbcb-96e3b00c1cab-kube-api-access-sdv8t\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685302 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685406 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685565 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.685642 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-config\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787537 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdv8t\" (UniqueName: \"kubernetes.io/projected/8926ccac-b553-4e37-bbcb-96e3b00c1cab-kube-api-access-sdv8t\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787638 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787677 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787739 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787774 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-config\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787813 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787907 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.787973 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.788300 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.790408 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.792941 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-config\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.793014 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8926ccac-b553-4e37-bbcb-96e3b00c1cab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.801021 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.801244 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.806576 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8926ccac-b553-4e37-bbcb-96e3b00c1cab-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.807701 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdv8t\" (UniqueName: \"kubernetes.io/projected/8926ccac-b553-4e37-bbcb-96e3b00c1cab-kube-api-access-sdv8t\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.809283 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8926ccac-b553-4e37-bbcb-96e3b00c1cab\") " pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:29 crc kubenswrapper[4926]: I0312 18:21:29.972430 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.239177 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.239346 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x94ql,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-2x544_openstack(757e1395-7d78-496a-bd4e-eb7a13b90d10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.240581 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544" podUID="757e1395-7d78-496a-bd4e-eb7a13b90d10" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.354573 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.354988 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvwd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-9hlv4_openstack(b74b0a6f-8e6a-4be5-8c67-7473b7c0041f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:21:30 crc kubenswrapper[4926]: E0312 18:21:30.356290 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4" podUID="b74b0a6f-8e6a-4be5-8c67-7473b7c0041f" Mar 12 18:21:30 crc kubenswrapper[4926]: I0312 18:21:30.998007 4926 generic.go:334] "Generic (PLEG): container finished" podID="5241fdef-63ed-416f-9bf0-c004597cc099" containerID="14fbc530f0eccf6bba2b3c2c222ca781991938a69c4f20b33818d33df7d57e2d" exitCode=0 Mar 12 18:21:30 crc kubenswrapper[4926]: I0312 18:21:30.998503 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" event={"ID":"5241fdef-63ed-416f-9bf0-c004597cc099","Type":"ContainerDied","Data":"14fbc530f0eccf6bba2b3c2c222ca781991938a69c4f20b33818d33df7d57e2d"} Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.013131 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f"} Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.019566 4926 generic.go:334] "Generic (PLEG): container finished" podID="6e12f792-0fbe-4996-a6ec-272e5108dd33" 
containerID="70b08923dd3c9c83aaab8247e1a7a0c78b75d9c1349d70ce24d38f2bd6316ae1" exitCode=0 Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.019691 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" event={"ID":"6e12f792-0fbe-4996-a6ec-272e5108dd33","Type":"ContainerDied","Data":"70b08923dd3c9c83aaab8247e1a7a0c78b75d9c1349d70ce24d38f2bd6316ae1"} Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.079375 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.131617 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.142105 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.177910 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.210903 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.226305 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.259935 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.332192 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.532384 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.537360 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.685975 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 18:21:31 crc kubenswrapper[4926]: W0312 18:21:31.694540 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8926ccac_b553_4e37_bbcb_96e3b00c1cab.slice/crio-0e9f3a68e6e59617bb832dfb4b84662a80a7305b65003215be030597cbad60c8 WatchSource:0}: Error finding container 0e9f3a68e6e59617bb832dfb4b84662a80a7305b65003215be030597cbad60c8: Status 404 returned error can't find the container with id 0e9f3a68e6e59617bb832dfb4b84662a80a7305b65003215be030597cbad60c8 Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.733285 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc\") pod \"757e1395-7d78-496a-bd4e-eb7a13b90d10\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.733781 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x94ql\" (UniqueName: \"kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql\") pod \"757e1395-7d78-496a-bd4e-eb7a13b90d10\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.733890 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "757e1395-7d78-496a-bd4e-eb7a13b90d10" (UID: "757e1395-7d78-496a-bd4e-eb7a13b90d10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.733921 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwd8\" (UniqueName: \"kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8\") pod \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.733966 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config\") pod \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\" (UID: \"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f\") " Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.734004 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config\") pod \"757e1395-7d78-496a-bd4e-eb7a13b90d10\" (UID: \"757e1395-7d78-496a-bd4e-eb7a13b90d10\") " Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.734376 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.734433 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config" (OuterVolumeSpecName: "config") pod "757e1395-7d78-496a-bd4e-eb7a13b90d10" (UID: "757e1395-7d78-496a-bd4e-eb7a13b90d10"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.734485 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config" (OuterVolumeSpecName: "config") pod "b74b0a6f-8e6a-4be5-8c67-7473b7c0041f" (UID: "b74b0a6f-8e6a-4be5-8c67-7473b7c0041f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.739957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8" (OuterVolumeSpecName: "kube-api-access-mvwd8") pod "b74b0a6f-8e6a-4be5-8c67-7473b7c0041f" (UID: "b74b0a6f-8e6a-4be5-8c67-7473b7c0041f"). InnerVolumeSpecName "kube-api-access-mvwd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.740585 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql" (OuterVolumeSpecName: "kube-api-access-x94ql") pod "757e1395-7d78-496a-bd4e-eb7a13b90d10" (UID: "757e1395-7d78-496a-bd4e-eb7a13b90d10"). InnerVolumeSpecName "kube-api-access-x94ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.790538 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-6znbv"] Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.836124 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x94ql\" (UniqueName: \"kubernetes.io/projected/757e1395-7d78-496a-bd4e-eb7a13b90d10-kube-api-access-x94ql\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.836155 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwd8\" (UniqueName: \"kubernetes.io/projected/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-kube-api-access-mvwd8\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.836166 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:31 crc kubenswrapper[4926]: I0312 18:21:31.836176 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757e1395-7d78-496a-bd4e-eb7a13b90d10-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:31 crc kubenswrapper[4926]: W0312 18:21:31.875687 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30dfa384_92a5_49cf_9793_60478855264f.slice/crio-01c671ef539d352b60786d84428aff26871c5405e87f77c3ee1ad41b2dca46ff WatchSource:0}: Error finding container 01c671ef539d352b60786d84428aff26871c5405e87f77c3ee1ad41b2dca46ff: Status 404 returned error can't find the container with id 01c671ef539d352b60786d84428aff26871c5405e87f77c3ee1ad41b2dca46ff Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.031801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3288572-a9dc-4f96-8535-b05f6f22855b","Type":"ContainerStarted","Data":"a78b120c7cf2870625a84495d8e3a18d8ff80a83d9bd025f6f535e76a6f9b7bf"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.043365 4926 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4" event={"ID":"b74b0a6f-8e6a-4be5-8c67-7473b7c0041f","Type":"ContainerDied","Data":"1cba9386f004d3a4567f81b3b21065a63a088e2f0cf17a6228bba62f9f3a3e10"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.043575 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-9hlv4" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.058730 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" event={"ID":"6e12f792-0fbe-4996-a6ec-272e5108dd33","Type":"ContainerStarted","Data":"2f4a53f9672845382051e0baaa943aad6fd054f0217c765f521ce6c2186e70ad"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.058907 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.066031 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" event={"ID":"5241fdef-63ed-416f-9bf0-c004597cc099","Type":"ContainerStarted","Data":"40595f3e3c0223d3ed3f1599101a865666d9ee1a40df2b9373fe5655ddc620fa"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.066120 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.067552 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8926ccac-b553-4e37-bbcb-96e3b00c1cab","Type":"ContainerStarted","Data":"0e9f3a68e6e59617bb832dfb4b84662a80a7305b65003215be030597cbad60c8"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.069262 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6znbv" event={"ID":"30dfa384-92a5-49cf-9793-60478855264f","Type":"ContainerStarted","Data":"01c671ef539d352b60786d84428aff26871c5405e87f77c3ee1ad41b2dca46ff"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.070821 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.070864 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-2x544" event={"ID":"757e1395-7d78-496a-bd4e-eb7a13b90d10","Type":"ContainerDied","Data":"8ced1e2cf1a0fabd93eb5ebb045e1e613152c80826e96ac3f9497d2c59523b9b"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.074331 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"94d38c16-a6c9-44ed-a49e-398dc34b92ce","Type":"ContainerStarted","Data":"8cffc6f2cf22e065c646327d7e2376f07621924b60d95b70b11c502492ded6be"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.075514 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerStarted","Data":"851881376da52e23e91e40413019b432ba8d1a58a75cb35642884721899d9a50"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.091697 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerStarted","Data":"1dfe0defccd325a5f2aba559c2e683f83b8337f73cdd57e5ea2b609807eb38bf"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.102021 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" podStartSLOduration=3.294161679 podStartE2EDuration="17.101998319s" podCreationTimestamp="2026-03-12 18:21:15 +0000 UTC" firstStartedPulling="2026-03-12 18:21:16.647341772 +0000 UTC m=+1117.015968105" lastFinishedPulling="2026-03-12 18:21:30.455178392 +0000 UTC m=+1130.823804745" observedRunningTime="2026-03-12 18:21:32.084191124 +0000 UTC m=+1132.452817457" watchObservedRunningTime="2026-03-12 18:21:32.101998319 +0000 UTC m=+1132.470624652" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.109721 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"69083379-a7d7-4876-9955-497420eab579","Type":"ContainerStarted","Data":"4038b17eb45e5fa7b20e5c814863f1f8b28f82cb2cfecff14d8f152dce6de6b8"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.111235 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3343d19e-07d3-4de8-954a-f7e31aa8279f","Type":"ContainerStarted","Data":"a5a3fb3c5e3c756132a6d36939e2dc50de833c191b34382d6e423a84d04ecf45"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.117600 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" podStartSLOduration=2.600653134 podStartE2EDuration="16.117578673s" podCreationTimestamp="2026-03-12 18:21:16 +0000 UTC" firstStartedPulling="2026-03-12 18:21:16.936092206 +0000 UTC m=+1117.304718529" lastFinishedPulling="2026-03-12 18:21:30.453017735 +0000 UTC m=+1130.821644068" observedRunningTime="2026-03-12 18:21:32.108322856 +0000 UTC m=+1132.476949179" watchObservedRunningTime="2026-03-12 18:21:32.117578673 +0000 UTC m=+1132.486205006" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.119561 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr" event={"ID":"8f19fbb7-ea3b-437a-a634-498e6a593ef6","Type":"ContainerStarted","Data":"0dee60d04f094592a5abba55f5c41b4de3a7242ba688e0eefb63ed7c3283abf5"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.126018 4926 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1ee00086-3c8a-4f3a-a5d5-9590715a8b95","Type":"ContainerStarted","Data":"9927d7adb7f8c599b3e1dbf5bd1405228ef1744ac07a01bf6d24cd343a94144e"} Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.169502 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"] Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.175337 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-9hlv4"] Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.194461 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"] Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.206969 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-2x544"] Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.500109 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757e1395-7d78-496a-bd4e-eb7a13b90d10" path="/var/lib/kubelet/pods/757e1395-7d78-496a-bd4e-eb7a13b90d10/volumes" Mar 12 18:21:32 crc kubenswrapper[4926]: I0312 18:21:32.500566 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74b0a6f-8e6a-4be5-8c67-7473b7c0041f" path="/var/lib/kubelet/pods/b74b0a6f-8e6a-4be5-8c67-7473b7c0041f/volumes" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.759922 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mszqk"] Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.793046 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mszqk"] Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.793160 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.795554 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888483 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888522 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovs-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888545 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-combined-ca-bundle\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888570 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovn-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888611 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-config\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.888633 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxd2\" (UniqueName: \"kubernetes.io/projected/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-kube-api-access-nhxd2\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.928737 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"] Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.966335 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.967552 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.969940 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.984334 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990281 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovs-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990330 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990352 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-combined-ca-bundle\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990380 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovn-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990422 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990460 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-config\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990484 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhxd2\" (UniqueName: \"kubernetes.io/projected/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-kube-api-access-nhxd2\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990529 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:33 crc 
kubenswrapper[4926]: I0312 18:21:33.990562 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28bxt\" (UniqueName: \"kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990598 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.990968 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovs-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.991508 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-ovn-rundir\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.992272 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-config\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:33 crc kubenswrapper[4926]: I0312 18:21:33.999133 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-combined-ca-bundle\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.008979 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.018141 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhxd2\" (UniqueName: \"kubernetes.io/projected/b4a45633-8ac7-497f-a8d7-2b7a3dad35bc-kube-api-access-nhxd2\") pod \"ovn-controller-metrics-mszqk\" (UID: \"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc\") " pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.081835 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"] Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.092225 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: 
\"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.092304 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28bxt\" (UniqueName: \"kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.092364 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.092453 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.093417 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.093459 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.093458 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.124333 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.127103 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.127951 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28bxt\" (UniqueName: \"kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt\") pod \"dnsmasq-dns-7fd796d7df-6csg8\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.130555 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mszqk" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.133884 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.144949 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="dnsmasq-dns" containerID="cri-o://40595f3e3c0223d3ed3f1599101a865666d9ee1a40df2b9373fe5655ddc620fa" gracePeriod=10 Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.145289 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="dnsmasq-dns" containerID="cri-o://2f4a53f9672845382051e0baaa943aad6fd054f0217c765f521ce6c2186e70ad" gracePeriod=10 Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.148713 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.195142 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.195179 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.195213 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.195264 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7c2g\" (UniqueName: \"kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.195462 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.287072 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.297067 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.297148 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.297172 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.297204 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.297230 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7c2g\" (UniqueName: \"kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.298201 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.298243 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.298259 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.299466 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 
18:21:34.317216 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7c2g\" (UniqueName: \"kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g\") pod \"dnsmasq-dns-86db49b7ff-htjb7\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:34 crc kubenswrapper[4926]: I0312 18:21:34.460706 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:35 crc kubenswrapper[4926]: I0312 18:21:35.164865 4926 generic.go:334] "Generic (PLEG): container finished" podID="5241fdef-63ed-416f-9bf0-c004597cc099" containerID="40595f3e3c0223d3ed3f1599101a865666d9ee1a40df2b9373fe5655ddc620fa" exitCode=0 Mar 12 18:21:35 crc kubenswrapper[4926]: I0312 18:21:35.164965 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" event={"ID":"5241fdef-63ed-416f-9bf0-c004597cc099","Type":"ContainerDied","Data":"40595f3e3c0223d3ed3f1599101a865666d9ee1a40df2b9373fe5655ddc620fa"} Mar 12 18:21:35 crc kubenswrapper[4926]: I0312 18:21:35.167498 4926 generic.go:334] "Generic (PLEG): container finished" podID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerID="2f4a53f9672845382051e0baaa943aad6fd054f0217c765f521ce6c2186e70ad" exitCode=0 Mar 12 18:21:35 crc kubenswrapper[4926]: I0312 18:21:35.167524 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" event={"ID":"6e12f792-0fbe-4996-a6ec-272e5108dd33","Type":"ContainerDied","Data":"2f4a53f9672845382051e0baaa943aad6fd054f0217c765f521ce6c2186e70ad"} Mar 12 18:21:36 crc kubenswrapper[4926]: I0312 18:21:36.326714 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: connect: connection refused" Mar 12 18:21:40 crc kubenswrapper[4926]: E0312 18:21:40.390801 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = reading blob sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d: Digest did not match, expected sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Mar 12 18:21:40 crc kubenswrapper[4926]: E0312 18:21:40.391527 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n87h665h5cfh55dh5chdfhdch55fh7ch56dhc8h9bh5d8hb7hc9h54ch597h66fh7fh645h5ddh56ch554h5b8h5f6h5cfh5f6h5b8hfh5bdh58dh5ddq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56ctw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-sfwpr_openstack(8f19fbb7-ea3b-437a-a634-498e6a593ef6): ErrImagePull: reading blob sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d: Digest did not match, expected sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d, got 
sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855" logger="UnhandledError" Mar 12 18:21:40 crc kubenswrapper[4926]: E0312 18:21:40.393040 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"reading blob sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d: Digest did not match, expected sha256:49c8ff4b1361fbc5f5e8704c41aa7b56c23335467834ce27b461e2430f9aa48d, got sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\"" pod="openstack/ovn-controller-sfwpr" podUID="8f19fbb7-ea3b-437a-a634-498e6a593ef6" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.408601 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.603841 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc\") pod \"5241fdef-63ed-416f-9bf0-c004597cc099\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.604244 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jt2\" (UniqueName: \"kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2\") pod \"5241fdef-63ed-416f-9bf0-c004597cc099\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.608059 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config\") pod \"5241fdef-63ed-416f-9bf0-c004597cc099\" (UID: \"5241fdef-63ed-416f-9bf0-c004597cc099\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.609622 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2" (OuterVolumeSpecName: "kube-api-access-r6jt2") pod "5241fdef-63ed-416f-9bf0-c004597cc099" (UID: "5241fdef-63ed-416f-9bf0-c004597cc099"). InnerVolumeSpecName "kube-api-access-r6jt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.699951 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config" (OuterVolumeSpecName: "config") pod "5241fdef-63ed-416f-9bf0-c004597cc099" (UID: "5241fdef-63ed-416f-9bf0-c004597cc099"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.700106 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5241fdef-63ed-416f-9bf0-c004597cc099" (UID: "5241fdef-63ed-416f-9bf0-c004597cc099"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.707712 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.710144 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.710183 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jt2\" (UniqueName: \"kubernetes.io/projected/5241fdef-63ed-416f-9bf0-c004597cc099-kube-api-access-r6jt2\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.710193 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5241fdef-63ed-416f-9bf0-c004597cc099-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.811805 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config\") pod \"6e12f792-0fbe-4996-a6ec-272e5108dd33\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.813324 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwd9r\" (UniqueName: \"kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r\") pod \"6e12f792-0fbe-4996-a6ec-272e5108dd33\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.813565 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc\") pod \"6e12f792-0fbe-4996-a6ec-272e5108dd33\" (UID: \"6e12f792-0fbe-4996-a6ec-272e5108dd33\") " Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.821947 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r" (OuterVolumeSpecName: "kube-api-access-jwd9r") pod "6e12f792-0fbe-4996-a6ec-272e5108dd33" (UID: "6e12f792-0fbe-4996-a6ec-272e5108dd33"). InnerVolumeSpecName "kube-api-access-jwd9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.841489 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e12f792-0fbe-4996-a6ec-272e5108dd33" (UID: "6e12f792-0fbe-4996-a6ec-272e5108dd33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.848113 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config" (OuterVolumeSpecName: "config") pod "6e12f792-0fbe-4996-a6ec-272e5108dd33" (UID: "6e12f792-0fbe-4996-a6ec-272e5108dd33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.915734 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.915759 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e12f792-0fbe-4996-a6ec-272e5108dd33-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:40 crc kubenswrapper[4926]: I0312 18:21:40.915790 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwd9r\" (UniqueName: \"kubernetes.io/projected/6e12f792-0fbe-4996-a6ec-272e5108dd33-kube-api-access-jwd9r\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.229933 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" event={"ID":"5241fdef-63ed-416f-9bf0-c004597cc099","Type":"ContainerDied","Data":"22b5f71bb0db465dea314a1f0530175669b6d30c754c5e44cf8c2d8ddfdcad3d"} Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.229969 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.230038 4926 scope.go:117] "RemoveContainer" containerID="40595f3e3c0223d3ed3f1599101a865666d9ee1a40df2b9373fe5655ddc620fa" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.234618 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" event={"ID":"6e12f792-0fbe-4996-a6ec-272e5108dd33","Type":"ContainerDied","Data":"9f0b5cf18f405df691a53e33e3f957de2dcab02a1304d75c9067e908e6be9be3"} Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.234754 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-px8d4" Mar 12 18:21:41 crc kubenswrapper[4926]: E0312 18:21:41.235808 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-sfwpr" podUID="8f19fbb7-ea3b-437a-a634-498e6a593ef6" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.278681 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"] Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.290539 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lk7rj"] Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.299386 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"] Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.307385 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-px8d4"] Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.637138 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-lk7rj" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.745878 4926 scope.go:117] "RemoveContainer" containerID="14fbc530f0eccf6bba2b3c2c222ca781991938a69c4f20b33818d33df7d57e2d" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.804380 4926 scope.go:117] "RemoveContainer" containerID="2f4a53f9672845382051e0baaa943aad6fd054f0217c765f521ce6c2186e70ad" Mar 12 18:21:41 crc kubenswrapper[4926]: I0312 18:21:41.861963 4926 scope.go:117] "RemoveContainer" containerID="70b08923dd3c9c83aaab8247e1a7a0c78b75d9c1349d70ce24d38f2bd6316ae1" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.128349 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mszqk"] Mar 12 18:21:42 crc kubenswrapper[4926]: W0312 18:21:42.140714 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4a45633_8ac7_497f_a8d7_2b7a3dad35bc.slice/crio-f7c4dce115367598ad2159524b93cf71ec6949a14754d95aea2017f89097389d WatchSource:0}: Error finding container f7c4dce115367598ad2159524b93cf71ec6949a14754d95aea2017f89097389d: Status 404 returned error can't find the container with id f7c4dce115367598ad2159524b93cf71ec6949a14754d95aea2017f89097389d Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.243794 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.246997 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3288572-a9dc-4f96-8535-b05f6f22855b","Type":"ContainerStarted","Data":"781ea0a760f08140b9883c5fed3a171df1919c1af90c679e6f40c64e3a5fa919"} Mar 12 18:21:42 crc kubenswrapper[4926]: W0312 18:21:42.250681 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10777956_1f18_4bd0_9790_aae020334e2c.slice/crio-1fc31c043cd931246845186f093dded25733b7a182735715b314acd79d3a77b3 WatchSource:0}: Error finding container 
1fc31c043cd931246845186f093dded25733b7a182735715b314acd79d3a77b3: Status 404 returned error can't find the container with id 1fc31c043cd931246845186f093dded25733b7a182735715b314acd79d3a77b3 Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.250815 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1ee00086-3c8a-4f3a-a5d5-9590715a8b95","Type":"ContainerStarted","Data":"cfea673c23f1afedb464a97e3d967b55a20c7c92aa61908df2cb8c1bc70bc1ba"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.253065 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8926ccac-b553-4e37-bbcb-96e3b00c1cab","Type":"ContainerStarted","Data":"8715b23c355f385951aad7ac8c2327f6b2a3947620ff78fcfcac3b66d8b5d8ed"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.257843 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mszqk" event={"ID":"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc","Type":"ContainerStarted","Data":"f7c4dce115367598ad2159524b93cf71ec6949a14754d95aea2017f89097389d"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.259739 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"94d38c16-a6c9-44ed-a49e-398dc34b92ce","Type":"ContainerStarted","Data":"fc66a1cd90ab8ffc8ce1b57a7c53af70932339daf452369540b2057b973e9dcc"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.261260 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"69083379-a7d7-4876-9955-497420eab579","Type":"ContainerStarted","Data":"78e04a3036a523b939ce014c96a8e4153242d2617ecfecaceb5bcfc558e634db"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.261662 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.264140 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3343d19e-07d3-4de8-954a-f7e31aa8279f","Type":"ContainerStarted","Data":"c936fa8962854c1d8665ba026930a8b325915da3444b99cba8eeb7debd2bb042"} Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.264271 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.298952 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.536803277 podStartE2EDuration="22.298928407s" podCreationTimestamp="2026-03-12 18:21:20 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.144925212 +0000 UTC m=+1131.513551545" lastFinishedPulling="2026-03-12 18:21:40.907050342 +0000 UTC m=+1141.275676675" observedRunningTime="2026-03-12 18:21:42.298627567 +0000 UTC m=+1142.667253900" watchObservedRunningTime="2026-03-12 18:21:42.298928407 +0000 UTC m=+1142.667554740" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.327921 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.651873875 podStartE2EDuration="20.327903948s" podCreationTimestamp="2026-03-12 18:21:22 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.075922266 +0000 UTC m=+1131.444548609" lastFinishedPulling="2026-03-12 18:21:41.751952349 +0000 UTC m=+1142.120578682" observedRunningTime="2026-03-12 18:21:42.324647527 +0000 UTC m=+1142.693273860" watchObservedRunningTime="2026-03-12 
18:21:42.327903948 +0000 UTC m=+1142.696530281" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.331034 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:21:42 crc kubenswrapper[4926]: W0312 18:21:42.377170 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7919b4ef_74fc_472e_9a5e_04216cc51ae5.slice/crio-4cd40efc4978929f55aec279245d0d1733c0429e93e1c880bdcbc1f4f5fc9094 WatchSource:0}: Error finding container 4cd40efc4978929f55aec279245d0d1733c0429e93e1c880bdcbc1f4f5fc9094: Status 404 returned error can't find the container with id 4cd40efc4978929f55aec279245d0d1733c0429e93e1c880bdcbc1f4f5fc9094 Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.500900 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" path="/var/lib/kubelet/pods/5241fdef-63ed-416f-9bf0-c004597cc099/volumes" Mar 12 18:21:42 crc kubenswrapper[4926]: I0312 18:21:42.501471 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" path="/var/lib/kubelet/pods/6e12f792-0fbe-4996-a6ec-272e5108dd33/volumes" Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.274396 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerStarted","Data":"cfa97fbf75b6c2e852f5d66a227450ea6c8da1b0641deb3da6fb3c35ba9a0f12"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.278493 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerStarted","Data":"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.280770 4926 generic.go:334] "Generic (PLEG): container finished" podID="30dfa384-92a5-49cf-9793-60478855264f" containerID="88c96a549cb85f5f7b5e96465e3531e8640e2842335f5511d9adb4f48dd39fe6" exitCode=0 Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.280823 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6znbv" event={"ID":"30dfa384-92a5-49cf-9793-60478855264f","Type":"ContainerDied","Data":"88c96a549cb85f5f7b5e96465e3531e8640e2842335f5511d9adb4f48dd39fe6"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.282204 4926 generic.go:334] "Generic (PLEG): container finished" podID="10777956-1f18-4bd0-9790-aae020334e2c" containerID="07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586" exitCode=0 Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.282257 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" event={"ID":"10777956-1f18-4bd0-9790-aae020334e2c","Type":"ContainerDied","Data":"07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.282278 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" event={"ID":"10777956-1f18-4bd0-9790-aae020334e2c","Type":"ContainerStarted","Data":"1fc31c043cd931246845186f093dded25733b7a182735715b314acd79d3a77b3"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.284061 4926 generic.go:334] "Generic (PLEG): container finished" podID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" 
containerID="d971505f58d662cfcbae50ecf97fffad61988b03d55fb8228578b1f86349cea3" exitCode=0 Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.284882 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" event={"ID":"7919b4ef-74fc-472e-9a5e-04216cc51ae5","Type":"ContainerDied","Data":"d971505f58d662cfcbae50ecf97fffad61988b03d55fb8228578b1f86349cea3"} Mar 12 18:21:43 crc kubenswrapper[4926]: I0312 18:21:43.284906 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" event={"ID":"7919b4ef-74fc-472e-9a5e-04216cc51ae5","Type":"ContainerStarted","Data":"4cd40efc4978929f55aec279245d0d1733c0429e93e1c880bdcbc1f4f5fc9094"} Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.299944 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" event={"ID":"7919b4ef-74fc-472e-9a5e-04216cc51ae5","Type":"ContainerStarted","Data":"bc88a073a21581ac4854225ad8d64ba64f0d8303fd8d746c072ccb3cc21dc625"} Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.300456 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.303526 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6znbv" event={"ID":"30dfa384-92a5-49cf-9793-60478855264f","Type":"ContainerStarted","Data":"ab5ddbfb0988a75cf57e9b5e666e014d1a4bc30b2f8e2f1f1fae6d36d374f8a0"} Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.303603 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-6znbv" event={"ID":"30dfa384-92a5-49cf-9793-60478855264f","Type":"ContainerStarted","Data":"8014dffc7c55d0d12ba9b5c567fd6c20c5f5e73dfb706ab9e919e48b133e0887"} Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.303894 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.315075 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" event={"ID":"10777956-1f18-4bd0-9790-aae020334e2c","Type":"ContainerStarted","Data":"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d"} Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.322243 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" podStartSLOduration=10.322229426 podStartE2EDuration="10.322229426s" podCreationTimestamp="2026-03-12 18:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:21:44.320413339 +0000 UTC m=+1144.689039672" watchObservedRunningTime="2026-03-12 18:21:44.322229426 +0000 UTC m=+1144.690855759" Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.341547 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" podStartSLOduration=11.341523406 podStartE2EDuration="11.341523406s" podCreationTimestamp="2026-03-12 18:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:21:44.33909138 +0000 UTC m=+1144.707717723" watchObservedRunningTime="2026-03-12 18:21:44.341523406 +0000 UTC m=+1144.710149749" Mar 12 18:21:44 crc kubenswrapper[4926]: I0312 18:21:44.367453 4926 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-6znbv" podStartSLOduration=9.528671949 podStartE2EDuration="19.367424432s" podCreationTimestamp="2026-03-12 18:21:25 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.878579328 +0000 UTC m=+1132.247205661" lastFinishedPulling="2026-03-12 18:21:41.717331801 +0000 UTC m=+1142.085958144" observedRunningTime="2026-03-12 18:21:44.360483965 +0000 UTC m=+1144.729110318" watchObservedRunningTime="2026-03-12 18:21:44.367424432 +0000 UTC m=+1144.736050755" Mar 12 18:21:45 crc kubenswrapper[4926]: I0312 18:21:45.322813 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:21:45 crc kubenswrapper[4926]: I0312 18:21:45.322881 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:46 crc kubenswrapper[4926]: E0312 18:21:46.022290 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94d38c16_a6c9_44ed_a49e_398dc34b92ce.slice/crio-fc66a1cd90ab8ffc8ce1b57a7c53af70932339daf452369540b2057b973e9dcc.scope\": RecentStats: unable to find data in memory cache]" Mar 12 18:21:46 crc kubenswrapper[4926]: I0312 18:21:46.335966 4926 generic.go:334] "Generic (PLEG): container finished" podID="94d38c16-a6c9-44ed-a49e-398dc34b92ce" containerID="fc66a1cd90ab8ffc8ce1b57a7c53af70932339daf452369540b2057b973e9dcc" exitCode=0 Mar 12 18:21:46 crc kubenswrapper[4926]: I0312 18:21:46.336039 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"94d38c16-a6c9-44ed-a49e-398dc34b92ce","Type":"ContainerDied","Data":"fc66a1cd90ab8ffc8ce1b57a7c53af70932339daf452369540b2057b973e9dcc"} Mar 12 18:21:46 crc kubenswrapper[4926]: I0312 18:21:46.338309 4926 generic.go:334] "Generic (PLEG): container finished" podID="1ee00086-3c8a-4f3a-a5d5-9590715a8b95" containerID="cfea673c23f1afedb464a97e3d967b55a20c7c92aa61908df2cb8c1bc70bc1ba" exitCode=0 Mar 12 18:21:46 crc kubenswrapper[4926]: I0312 18:21:46.339062 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1ee00086-3c8a-4f3a-a5d5-9590715a8b95","Type":"ContainerDied","Data":"cfea673c23f1afedb464a97e3d967b55a20c7c92aa61908df2cb8c1bc70bc1ba"} Mar 12 18:21:49 crc kubenswrapper[4926]: I0312 18:21:49.289626 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:49 crc kubenswrapper[4926]: I0312 18:21:49.462577 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:21:49 crc kubenswrapper[4926]: I0312 18:21:49.555453 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:49 crc kubenswrapper[4926]: I0312 18:21:49.556025 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="dnsmasq-dns" containerID="cri-o://40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d" gracePeriod=10 Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.035210 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.135906 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config\") pod \"10777956-1f18-4bd0-9790-aae020334e2c\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.136046 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28bxt\" (UniqueName: \"kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt\") pod \"10777956-1f18-4bd0-9790-aae020334e2c\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.136086 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb\") pod \"10777956-1f18-4bd0-9790-aae020334e2c\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.136108 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc\") pod \"10777956-1f18-4bd0-9790-aae020334e2c\" (UID: \"10777956-1f18-4bd0-9790-aae020334e2c\") " Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.146228 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt" (OuterVolumeSpecName: "kube-api-access-28bxt") pod "10777956-1f18-4bd0-9790-aae020334e2c" (UID: "10777956-1f18-4bd0-9790-aae020334e2c"). InnerVolumeSpecName "kube-api-access-28bxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.172342 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10777956-1f18-4bd0-9790-aae020334e2c" (UID: "10777956-1f18-4bd0-9790-aae020334e2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.174416 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10777956-1f18-4bd0-9790-aae020334e2c" (UID: "10777956-1f18-4bd0-9790-aae020334e2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.187050 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config" (OuterVolumeSpecName: "config") pod "10777956-1f18-4bd0-9790-aae020334e2c" (UID: "10777956-1f18-4bd0-9790-aae020334e2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.237957 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.237998 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28bxt\" (UniqueName: \"kubernetes.io/projected/10777956-1f18-4bd0-9790-aae020334e2c-kube-api-access-28bxt\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.238010 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.238018 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10777956-1f18-4bd0-9790-aae020334e2c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.389202 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mszqk" event={"ID":"b4a45633-8ac7-497f-a8d7-2b7a3dad35bc","Type":"ContainerStarted","Data":"1ca41fb588e43dfa39c2e3e61a6ab9cc617716eec43113b0c1873a7e2ca9e13a"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.392691 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1ee00086-3c8a-4f3a-a5d5-9590715a8b95","Type":"ContainerStarted","Data":"499f0c68d499ee45ab69e4ac637942df91eb78121e1d774bbabb9618a9b25fff"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.396080 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"94d38c16-a6c9-44ed-a49e-398dc34b92ce","Type":"ContainerStarted","Data":"ea2fbdc081b0d7772aa49fd0d249576d8ae27495d6fc4b3dc4b82f68f67285ba"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.399068 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c3288572-a9dc-4f96-8535-b05f6f22855b","Type":"ContainerStarted","Data":"4281dca9007224e780bd7011e967ed1045d9d28f90c61c7afe89ae71e702c2a1"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.401352 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8926ccac-b553-4e37-bbcb-96e3b00c1cab","Type":"ContainerStarted","Data":"4cd57bc1175117be13d5d5343e5843a30c935b22b6f485dc1a31a363cd921e85"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.403289 4926 generic.go:334] "Generic (PLEG): container finished" podID="10777956-1f18-4bd0-9790-aae020334e2c" containerID="40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d" exitCode=0 Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.403327 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" event={"ID":"10777956-1f18-4bd0-9790-aae020334e2c","Type":"ContainerDied","Data":"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d"} Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.403348 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" event={"ID":"10777956-1f18-4bd0-9790-aae020334e2c","Type":"ContainerDied","Data":"1fc31c043cd931246845186f093dded25733b7a182735715b314acd79d3a77b3"} Mar 12 18:21:50 crc 
kubenswrapper[4926]: I0312 18:21:50.403365 4926 scope.go:117] "RemoveContainer" containerID="40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.403492 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-6csg8" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.409417 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mszqk" podStartSLOduration=10.068310143 podStartE2EDuration="17.40939882s" podCreationTimestamp="2026-03-12 18:21:33 +0000 UTC" firstStartedPulling="2026-03-12 18:21:42.146159943 +0000 UTC m=+1142.514786276" lastFinishedPulling="2026-03-12 18:21:49.48724862 +0000 UTC m=+1149.855874953" observedRunningTime="2026-03-12 18:21:50.408075729 +0000 UTC m=+1150.776702082" watchObservedRunningTime="2026-03-12 18:21:50.40939882 +0000 UTC m=+1150.778025163" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.441907 4926 scope.go:117] "RemoveContainer" containerID="07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.454878 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.909324721 podStartE2EDuration="33.454850354s" podCreationTimestamp="2026-03-12 18:21:17 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.111142431 +0000 UTC m=+1131.479768764" lastFinishedPulling="2026-03-12 18:21:41.656668064 +0000 UTC m=+1142.025294397" observedRunningTime="2026-03-12 18:21:50.445935947 +0000 UTC m=+1150.814562310" watchObservedRunningTime="2026-03-12 18:21:50.454850354 +0000 UTC m=+1150.823476697" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.474094 4926 scope.go:117] "RemoveContainer" containerID="40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d" Mar 12 18:21:50 crc kubenswrapper[4926]: E0312 18:21:50.474813 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d\": container with ID starting with 40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d not found: ID does not exist" containerID="40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.474843 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d"} err="failed to get container status \"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d\": rpc error: code = NotFound desc = could not find container \"40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d\": container with ID starting with 40485e5bf2b0729b4775af9e0b049a41636e31956c43cb5591bee9a26f2eef0d not found: ID does not exist" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.474862 4926 scope.go:117] "RemoveContainer" containerID="07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586" Mar 12 18:21:50 crc kubenswrapper[4926]: E0312 18:21:50.475140 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586\": container with ID starting with 07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586 not found: 
ID does not exist" containerID="07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.475160 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586"} err="failed to get container status \"07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586\": rpc error: code = NotFound desc = could not find container \"07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586\": container with ID starting with 07e4167e25a44dc3413aab2abedb803e49deb6627acac90612e2aefe927ed586 not found: ID does not exist" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.498810 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.720888814 podStartE2EDuration="22.498782421s" podCreationTimestamp="2026-03-12 18:21:28 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.702529921 +0000 UTC m=+1132.071156254" lastFinishedPulling="2026-03-12 18:21:49.480423528 +0000 UTC m=+1149.849049861" observedRunningTime="2026-03-12 18:21:50.475045512 +0000 UTC m=+1150.843671845" watchObservedRunningTime="2026-03-12 18:21:50.498782421 +0000 UTC m=+1150.867408764" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.535369 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.996014239 podStartE2EDuration="32.535345289s" podCreationTimestamp="2026-03-12 18:21:18 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.223234849 +0000 UTC m=+1131.591861182" lastFinishedPulling="2026-03-12 18:21:41.762565899 +0000 UTC m=+1142.131192232" observedRunningTime="2026-03-12 18:21:50.516907134 +0000 UTC m=+1150.885533477" watchObservedRunningTime="2026-03-12 18:21:50.535345289 +0000 UTC m=+1150.903971622" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.559646 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.630391 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.526860908 podStartE2EDuration="26.630376115s" podCreationTimestamp="2026-03-12 18:21:24 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.380280165 +0000 UTC m=+1131.748906498" lastFinishedPulling="2026-03-12 18:21:49.483795372 +0000 UTC m=+1149.852421705" observedRunningTime="2026-03-12 18:21:50.594826069 +0000 UTC m=+1150.963452422" watchObservedRunningTime="2026-03-12 18:21:50.630376115 +0000 UTC m=+1150.999002448" Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.657782 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.664599 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-6csg8"] Mar 12 18:21:50 crc kubenswrapper[4926]: I0312 18:21:50.972838 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:51 crc kubenswrapper[4926]: I0312 18:21:51.010091 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:51 crc kubenswrapper[4926]: I0312 18:21:51.274540 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 12 
18:21:51 crc kubenswrapper[4926]: I0312 18:21:51.411902 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:51 crc kubenswrapper[4926]: I0312 18:21:51.461287 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.499605 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10777956-1f18-4bd0-9790-aae020334e2c" path="/var/lib/kubelet/pods/10777956-1f18-4bd0-9790-aae020334e2c/volumes" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692172 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"] Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692544 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692565 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692595 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692601 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692617 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692623 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692639 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692645 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692660 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692667 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: E0312 18:21:52.692684 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692690 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="init" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692835 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e12f792-0fbe-4996-a6ec-272e5108dd33" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692852 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="10777956-1f18-4bd0-9790-aae020334e2c" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.692865 4926 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5241fdef-63ed-416f-9bf0-c004597cc099" containerName="dnsmasq-dns" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.700497 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.718513 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.739536 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"] Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.784163 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.784263 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.784314 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv46v\" (UniqueName: \"kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.784345 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.784703 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.886915 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv46v\" (UniqueName: \"kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.886992 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.887067 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.887093 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.887134 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.887847 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.888651 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.889268 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.889812 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:52 crc kubenswrapper[4926]: I0312 18:21:52.913207 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv46v\" (UniqueName: \"kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v\") pod \"dnsmasq-dns-698758b865-7vm4l\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.018641 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.279024 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.333362 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.471402 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.573503 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"] Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.636633 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.638006 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.646047 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-pvx7j" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.646236 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.646524 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.647257 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.664151 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799259 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-scripts\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799329 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799460 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-config\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799493 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799525 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799599 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.799627 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gmzl\" (UniqueName: \"kubernetes.io/projected/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-kube-api-access-6gmzl\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.863573 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.869414 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.871930 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.872085 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.872755 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.876320 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lnfrb" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.907584 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.909937 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-config\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.909999 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.910049 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.910101 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.910133 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gmzl\" (UniqueName: \"kubernetes.io/projected/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-kube-api-access-6gmzl\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.910150 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-scripts\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.910914 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-scripts\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.913291 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-config\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.913591 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.929395 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.932216 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.933730 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.950787 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.953701 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gmzl\" (UniqueName: \"kubernetes.io/projected/3ffbc2f0-cbec-43d4-9907-a95af037ae1b-kube-api-access-6gmzl\") pod \"ovn-northd-0\" (UID: \"3ffbc2f0-cbec-43d4-9907-a95af037ae1b\") " pod="openstack/ovn-northd-0" Mar 12 18:21:53 crc 
Mar 12 18:21:53 crc kubenswrapper[4926]: I0312 18:21:53.975971 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012267 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57853681-32de-4475-9c7d-3f9708fe7d91-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012325 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7nq\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-kube-api-access-ts7nq\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012348 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-lock\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012379 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-cache\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012403 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.012460 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114302 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-cache\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114702 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114734 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114796 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57853681-32de-4475-9c7d-3f9708fe7d91-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114832 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7nq\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-kube-api-access-ts7nq\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.114850 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-lock\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.115451 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-lock\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.115673 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/57853681-32de-4475-9c7d-3f9708fe7d91-cache\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.115908 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.117997 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.118036 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.118193 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:21:54.618083775 +0000 UTC m=+1154.986710158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.124146 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57853681-32de-4475-9c7d-3f9708fe7d91-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.138031 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7nq\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-kube-api-access-ts7nq\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.147866 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.433260 4926 generic.go:334] "Generic (PLEG): container finished" podID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerID="50524e634e7b1a90622cd86f4a3bdca7cea0ea5d1a412cda03a7f2f2cce56ce2" exitCode=0
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.433467 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7vm4l" event={"ID":"f569f4cb-b487-41d8-bab4-5c2d7aba2219","Type":"ContainerDied","Data":"50524e634e7b1a90622cd86f4a3bdca7cea0ea5d1a412cda03a7f2f2cce56ce2"}
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.433518 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7vm4l" event={"ID":"f569f4cb-b487-41d8-bab4-5c2d7aba2219","Type":"ContainerStarted","Data":"1a2ca21647a28c29a2f5caa87bf2be645c7eb58039451905ac1f8540e674ede0"}
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.501048 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 18:21:54 crc kubenswrapper[4926]: W0312 18:21:54.609006 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ffbc2f0_cbec_43d4_9907_a95af037ae1b.slice/crio-256178c2d03d737a1f12f6ff6236c2beeb58150128480f3343ee00ab2765e9b1 WatchSource:0}: Error finding container 256178c2d03d737a1f12f6ff6236c2beeb58150128480f3343ee00ab2765e9b1: Status 404 returned error can't find the container with id 256178c2d03d737a1f12f6ff6236c2beeb58150128480f3343ee00ab2765e9b1
Mar 12 18:21:54 crc kubenswrapper[4926]: I0312 18:21:54.621685 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.622154 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.623228 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
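[Annotation] The etc-swift volume is a projected volume whose source is the openstack/swift-ring-files ConfigMap, which does not exist yet, so every SetUp attempt fails at projected.go. A minimal diagnostic sketch reproducing the kubelet's check from outside the node — kubeconfig location is assumed; namespace and ConfigMap name are taken from the log:

```go
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: kubeconfig at the default ~/.kube/config location.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	_, err = cs.CoreV1().ConfigMaps("openstack").Get(context.TODO(),
		"swift-ring-files", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		// Matches the kubelet's error string above.
		fmt.Println(`configmap "swift-ring-files" not found`)
	case err != nil:
		panic(err)
	default:
		fmt.Println("swift-ring-files exists; etc-swift should mount on the next retry")
	}
}
```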
Mar 12 18:21:54 crc kubenswrapper[4926]: E0312 18:21:54.623284 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:21:55.623266093 +0000 UTC m=+1155.991892446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.444203 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7vm4l" event={"ID":"f569f4cb-b487-41d8-bab4-5c2d7aba2219","Type":"ContainerStarted","Data":"633289debf3f6b1a2f62e3d378805f919d6393fc63c3e3559b1339530737acbe"}
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.444662 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7vm4l"
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.445548 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ffbc2f0-cbec-43d4-9907-a95af037ae1b","Type":"ContainerStarted","Data":"256178c2d03d737a1f12f6ff6236c2beeb58150128480f3343ee00ab2765e9b1"}
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.448029 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr" event={"ID":"8f19fbb7-ea3b-437a-a634-498e6a593ef6","Type":"ContainerStarted","Data":"29225e5590e6cbe2eb3df76009321f8119b0d09e042cdfc6354903e1c59fcfc0"}
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.448376 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sfwpr"
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.495963 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sfwpr" podStartSLOduration=7.094639749 podStartE2EDuration="30.495946653s" podCreationTimestamp="2026-03-12 18:21:25 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.249857957 +0000 UTC m=+1131.618484290" lastFinishedPulling="2026-03-12 18:21:54.651164861 +0000 UTC m=+1155.019791194" observedRunningTime="2026-03-12 18:21:55.489373608 +0000 UTC m=+1155.857999941" watchObservedRunningTime="2026-03-12 18:21:55.495946653 +0000 UTC m=+1155.864572996"
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.499549 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-7vm4l" podStartSLOduration=3.499544755 podStartE2EDuration="3.499544755s" podCreationTimestamp="2026-03-12 18:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:21:55.471838133 +0000 UTC m=+1155.840464466" watchObservedRunningTime="2026-03-12 18:21:55.499544755 +0000 UTC m=+1155.868171088"
Mar 12 18:21:55 crc kubenswrapper[4926]: I0312 18:21:55.638002 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
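[Annotation] The ovn-controller-sfwpr startup-latency entry above is internally consistent: podStartSLOduration is the end-to-end startup time minus the image-pull window (lastFinishedPulling - firstStartedPulling), which is why it is much smaller than podStartE2EDuration. A short Go check using the timestamps copied from that entry:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matches Go's default time.Time string form used in the log.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-03-12 18:21:25 +0000 UTC")                 // podCreationTimestamp
	firstPull := parse("2026-03-12 18:21:31.249857957 +0000 UTC")     // firstStartedPulling
	lastPull := parse("2026-03-12 18:21:54.651164861 +0000 UTC")      // lastFinishedPulling
	running := parse("2026-03-12 18:21:55.495946653 +0000 UTC")       // watchObservedRunningTime

	e2e := running.Sub(created)        // 30.495946653s = podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 7.094639749s = podStartSLOduration
	fmt.Println(e2e, slo)
}
```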
Mar 12 18:21:55 crc kubenswrapper[4926]: E0312 18:21:55.638246 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 18:21:55 crc kubenswrapper[4926]: E0312 18:21:55.638268 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 12 18:21:55 crc kubenswrapper[4926]: E0312 18:21:55.638324 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:21:57.638305022 +0000 UTC m=+1158.006931355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found
Mar 12 18:21:56 crc kubenswrapper[4926]: I0312 18:21:56.456586 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ffbc2f0-cbec-43d4-9907-a95af037ae1b","Type":"ContainerStarted","Data":"1cb27375fd5a2f047a3e9c961b0ae084ca462a4c2adacc72740c9e181bac727c"}
Mar 12 18:21:56 crc kubenswrapper[4926]: I0312 18:21:56.456880 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3ffbc2f0-cbec-43d4-9907-a95af037ae1b","Type":"ContainerStarted","Data":"98ef7afd04dcaf2ad4757cd1e70abbc89963107563e73d273da801e0cb8d6fcc"}
Mar 12 18:21:56 crc kubenswrapper[4926]: I0312 18:21:56.457287 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 12 18:21:56 crc kubenswrapper[4926]: I0312 18:21:56.495955 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.44122957 podStartE2EDuration="3.495933645s" podCreationTimestamp="2026-03-12 18:21:53 +0000 UTC" firstStartedPulling="2026-03-12 18:21:54.612120145 +0000 UTC m=+1154.980746498" lastFinishedPulling="2026-03-12 18:21:55.66682424 +0000 UTC m=+1156.035450573" observedRunningTime="2026-03-12 18:21:56.481841687 +0000 UTC m=+1156.850468040" watchObservedRunningTime="2026-03-12 18:21:56.495933645 +0000 UTC m=+1156.864559978"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.680782 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0"
Mar 12 18:21:57 crc kubenswrapper[4926]: E0312 18:21:57.680964 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 18:21:57 crc kubenswrapper[4926]: E0312 18:21:57.681271 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 12 18:21:57 crc kubenswrapper[4926]: E0312 18:21:57.681332 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:22:01.681308884 +0000 UTC m=+1162.049935297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found
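[Annotation] The durationBeforeRetry values in the nestedpendingoperations entries double on each failure: 500ms, 1s, 2s, 4s (and 8s further down). A sketch that reproduces exactly that cadence; the doubling from a 500ms base is what the log shows, while the cap is illustrative rather than read from the log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 500 * time.Millisecond
	const maxBackoff = 2 * time.Minute // illustrative cap, not from the log
	for attempt := 1; attempt <= 5; attempt++ {
		// Prints: 500ms, 1s, 2s, 4s, 8s -- matching the retry entries above.
		fmt.Printf("attempt %d failed, no retries permitted for %s\n", attempt, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```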
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.780925 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bn6gf"]
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.782551 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.784858 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.785942 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.791571 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.793732 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bn6gf"]
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.884896 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.884972 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.885029 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bg7n\" (UniqueName: \"kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.885132 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.885171 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.885260 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.885346 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987595 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987671 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987741 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987823 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987895 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987930 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.987969 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bg7n\" (UniqueName: \"kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.988604 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.988840 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.988857 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.994379 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.994570 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:57 crc kubenswrapper[4926]: I0312 18:21:57.996719 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Mar 12 18:21:58 crc kubenswrapper[4926]: I0312 18:21:58.019415 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bg7n\" (UniqueName: \"kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n\") pod \"swift-ring-rebalance-bn6gf\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " pod="openstack/swift-ring-rebalance-bn6gf"
Need to start a new one" pod="openstack/swift-ring-rebalance-bn6gf" Mar 12 18:21:58 crc kubenswrapper[4926]: I0312 18:21:58.661907 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 12 18:21:58 crc kubenswrapper[4926]: I0312 18:21:58.662225 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 12 18:21:58 crc kubenswrapper[4926]: I0312 18:21:58.670397 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bn6gf"] Mar 12 18:21:58 crc kubenswrapper[4926]: I0312 18:21:58.750183 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 12 18:21:59 crc kubenswrapper[4926]: I0312 18:21:59.480832 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bn6gf" event={"ID":"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f","Type":"ContainerStarted","Data":"beae81a2d2cc5a26e427094e9b94a43594531b558df0f2e26a7a5160b7c22dda"} Mar 12 18:21:59 crc kubenswrapper[4926]: I0312 18:21:59.549141 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.049495 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.049528 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.130061 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555662-wmwgq"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.131208 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.133534 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.134471 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.134605 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.136505 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555662-wmwgq"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.139093 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.229225 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9pnj\" (UniqueName: \"kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj\") pod \"auto-csr-approver-29555662-wmwgq\" (UID: \"3def1951-b6f9-4621-8428-b3e169e34279\") " pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.330285 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9pnj\" (UniqueName: \"kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj\") pod \"auto-csr-approver-29555662-wmwgq\" (UID: \"3def1951-b6f9-4621-8428-b3e169e34279\") " pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.349041 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9pnj\" (UniqueName: \"kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj\") pod \"auto-csr-approver-29555662-wmwgq\" (UID: \"3def1951-b6f9-4621-8428-b3e169e34279\") " pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.453728 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.589156 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.851252 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-15a0-account-create-update-b4wdv"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.860039 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-15a0-account-create-update-b4wdv"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.860126 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.862095 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.901170 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kslsb"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.902307 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kslsb" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.916229 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kslsb"] Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.941909 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrl49\" (UniqueName: \"kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49\") pod \"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:00 crc kubenswrapper[4926]: I0312 18:22:00.942104 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts\") pod \"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.043386 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts\") pod \"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.043845 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.044104 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gfv\" (UniqueName: \"kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.044377 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrl49\" (UniqueName: \"kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49\") pod \"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.044184 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts\") pod 
\"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.065278 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrl49\" (UniqueName: \"kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49\") pod \"glance-15a0-account-create-update-b4wdv\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.145773 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.145850 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gfv\" (UniqueName: \"kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.147022 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.163096 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gfv\" (UniqueName: \"kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv\") pod \"glance-db-create-kslsb\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.177933 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.228669 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kslsb" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.487992 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-gk626"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.489390 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.496233 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gk626"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.591665 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c596-account-create-update-kw2ql"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.592986 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.595425 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.605350 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c596-account-create-update-kw2ql"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.653130 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.654590 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k276h\" (UniqueName: \"kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.734203 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-87jsw"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.735344 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-87jsw" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.752796 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-87jsw"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.756400 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k276h\" (UniqueName: \"kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.756474 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.756500 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.756571 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.756601 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjz72\" (UniqueName: 
\"kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: E0312 18:22:01.756864 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 18:22:01 crc kubenswrapper[4926]: E0312 18:22:01.756894 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 18:22:01 crc kubenswrapper[4926]: E0312 18:22:01.756979 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:22:09.756961996 +0000 UTC m=+1170.125588329 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.757426 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.804343 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k276h\" (UniqueName: \"kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h\") pod \"keystone-db-create-gk626\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.809641 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b886-account-create-update-sdqf6"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.811254 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gk626" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.814756 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.817871 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.840601 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b886-account-create-update-sdqf6"] Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.878673 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26s4h\" (UniqueName: \"kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.878903 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.879047 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjz72\" (UniqueName: \"kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.879391 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.879644 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.900913 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjz72\" (UniqueName: \"kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72\") pod \"keystone-c596-account-create-update-kw2ql\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.912413 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.980744 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.980797 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brwx\" (UniqueName: \"kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.980967 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.981154 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26s4h\" (UniqueName: \"kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:01 crc kubenswrapper[4926]: I0312 18:22:01.981789 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.011411 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26s4h\" (UniqueName: \"kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h\") pod \"placement-db-create-87jsw\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " pod="openstack/placement-db-create-87jsw" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.057118 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-87jsw" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.086032 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.086095 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8brwx\" (UniqueName: \"kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.087863 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.105686 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brwx\" (UniqueName: \"kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx\") pod \"placement-b886-account-create-update-sdqf6\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:02 crc kubenswrapper[4926]: I0312 18:22:02.136079 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.023641 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.129870 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.130133 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="dnsmasq-dns" containerID="cri-o://bc88a073a21581ac4854225ad8d64ba64f0d8303fd8d746c072ccb3cc21dc625" gracePeriod=10 Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.466028 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kslsb"] Mar 12 18:22:03 crc kubenswrapper[4926]: W0312 18:22:03.543903 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3def1951_b6f9_4621_8428_b3e169e34279.slice/crio-9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c WatchSource:0}: Error finding container 9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c: Status 404 returned error can't find the container with id 9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.546393 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555662-wmwgq"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.548201 4926 generic.go:334] "Generic (PLEG): container finished" podID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerID="bc88a073a21581ac4854225ad8d64ba64f0d8303fd8d746c072ccb3cc21dc625" exitCode=0 Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.548312 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" event={"ID":"7919b4ef-74fc-472e-9a5e-04216cc51ae5","Type":"ContainerDied","Data":"bc88a073a21581ac4854225ad8d64ba64f0d8303fd8d746c072ccb3cc21dc625"} Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.550343 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bn6gf" event={"ID":"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f","Type":"ContainerStarted","Data":"251cc81f683a75e3d21cb79796d4e3d3d467bddd70539c194beb7a011504559b"} Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.552932 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kslsb" event={"ID":"eb8a8892-82ee-4502-b76d-ca289485809b","Type":"ContainerStarted","Data":"00b76c7be6d0fdb000f31c1e38cf50ca0b2d1f46c57eb45e800949b818b564a5"} Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.560565 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c596-account-create-update-kw2ql"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.579086 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-bn6gf" podStartSLOduration=2.513716254 podStartE2EDuration="6.579068886s" podCreationTimestamp="2026-03-12 18:21:57 +0000 UTC" firstStartedPulling="2026-03-12 18:21:58.679219611 +0000 UTC m=+1159.047845944" lastFinishedPulling="2026-03-12 18:22:02.744572243 +0000 UTC m=+1163.113198576" observedRunningTime="2026-03-12 18:22:03.566070501 +0000 UTC 
m=+1163.934696834" watchObservedRunningTime="2026-03-12 18:22:03.579068886 +0000 UTC m=+1163.947695219" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.652553 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.760967 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-15a0-account-create-update-b4wdv"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.772518 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-87jsw"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.782270 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b886-account-create-update-sdqf6"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.786694 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gk626"] Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.828228 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config\") pod \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.828427 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb\") pod \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.828477 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7c2g\" (UniqueName: \"kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g\") pod \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.828503 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc\") pod \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.828544 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb\") pod \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\" (UID: \"7919b4ef-74fc-472e-9a5e-04216cc51ae5\") " Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.845408 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g" (OuterVolumeSpecName: "kube-api-access-l7c2g") pod "7919b4ef-74fc-472e-9a5e-04216cc51ae5" (UID: "7919b4ef-74fc-472e-9a5e-04216cc51ae5"). InnerVolumeSpecName "kube-api-access-l7c2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.888703 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7919b4ef-74fc-472e-9a5e-04216cc51ae5" (UID: "7919b4ef-74fc-472e-9a5e-04216cc51ae5"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.931763 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.931836 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7c2g\" (UniqueName: \"kubernetes.io/projected/7919b4ef-74fc-472e-9a5e-04216cc51ae5-kube-api-access-l7c2g\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.932931 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7919b4ef-74fc-472e-9a5e-04216cc51ae5" (UID: "7919b4ef-74fc-472e-9a5e-04216cc51ae5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.933057 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7919b4ef-74fc-472e-9a5e-04216cc51ae5" (UID: "7919b4ef-74fc-472e-9a5e-04216cc51ae5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:03 crc kubenswrapper[4926]: I0312 18:22:03.943975 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config" (OuterVolumeSpecName: "config") pod "7919b4ef-74fc-472e-9a5e-04216cc51ae5" (UID: "7919b4ef-74fc-472e-9a5e-04216cc51ae5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.033172 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.033202 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.033216 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7919b4ef-74fc-472e-9a5e-04216cc51ae5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.565571 4926 generic.go:334] "Generic (PLEG): container finished" podID="ce8493fd-3e35-41fb-8daa-febd2238ce1b" containerID="0477440cc1caee314ece47874cba75dc234908b5f0302ff1737f87d526dec4a4" exitCode=0 Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.565733 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-87jsw" event={"ID":"ce8493fd-3e35-41fb-8daa-febd2238ce1b","Type":"ContainerDied","Data":"0477440cc1caee314ece47874cba75dc234908b5f0302ff1737f87d526dec4a4"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.565826 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-87jsw" event={"ID":"ce8493fd-3e35-41fb-8daa-febd2238ce1b","Type":"ContainerStarted","Data":"2fffaad6d5a06f655ff4c8d188a160c681cfc6e69dc4ef7ffcfa90f4d6f269b4"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.568090 4926 generic.go:334] "Generic (PLEG): container finished" podID="1e559975-5aca-457e-8c50-465552595381" containerID="ed6d1abff4f7da34f442e943b17a81aeae55a22054d604db8a2f0c813a7bb890" exitCode=0 Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.568135 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c596-account-create-update-kw2ql" event={"ID":"1e559975-5aca-457e-8c50-465552595381","Type":"ContainerDied","Data":"ed6d1abff4f7da34f442e943b17a81aeae55a22054d604db8a2f0c813a7bb890"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.568152 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c596-account-create-update-kw2ql" event={"ID":"1e559975-5aca-457e-8c50-465552595381","Type":"ContainerStarted","Data":"12f3735d737a130d7a9095b30d8233ba764f6de01a5c6fe30de6758919c90b92"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.570553 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b886-account-create-update-sdqf6" event={"ID":"3f98d642-e5f2-44de-9259-15b5eed6b80c","Type":"ContainerStarted","Data":"2ba173648181598e4717e6ee2a92ecb7ee6351d447d5a38a4302f5af077322b5"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.570582 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b886-account-create-update-sdqf6" event={"ID":"3f98d642-e5f2-44de-9259-15b5eed6b80c","Type":"ContainerStarted","Data":"2b00f6a8874e175c9a39535d556a6fbc2f2a3ee6efb123f9a4087969dcab6675"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.574825 4926 generic.go:334] "Generic (PLEG): container finished" podID="eb8a8892-82ee-4502-b76d-ca289485809b" containerID="d5aba66a32fd03fd9f7cad091c15aa0c8e52be10f9b3914b0869cd5e59465cde" exitCode=0 Mar 12 18:22:04 crc 
kubenswrapper[4926]: I0312 18:22:04.574884 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kslsb" event={"ID":"eb8a8892-82ee-4502-b76d-ca289485809b","Type":"ContainerDied","Data":"d5aba66a32fd03fd9f7cad091c15aa0c8e52be10f9b3914b0869cd5e59465cde"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.581094 4926 generic.go:334] "Generic (PLEG): container finished" podID="51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" containerID="97dffad6ad390315746d80bb46b2150bb89f6d5359806877446913c809356199" exitCode=0 Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.581159 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gk626" event={"ID":"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313","Type":"ContainerDied","Data":"97dffad6ad390315746d80bb46b2150bb89f6d5359806877446913c809356199"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.581188 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gk626" event={"ID":"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313","Type":"ContainerStarted","Data":"2dfd9c4d7dc7e12afe66ccf426b7bea55f65eaade1defa6244ce76ada3b5ba5c"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.584833 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" event={"ID":"3def1951-b6f9-4621-8428-b3e169e34279","Type":"ContainerStarted","Data":"9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.586952 4926 generic.go:334] "Generic (PLEG): container finished" podID="2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" containerID="7093c3070454c81ffcfc913032818ad12b03f50c370515a66252d6a038735742" exitCode=0 Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.587009 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15a0-account-create-update-b4wdv" event={"ID":"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496","Type":"ContainerDied","Data":"7093c3070454c81ffcfc913032818ad12b03f50c370515a66252d6a038735742"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.587025 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15a0-account-create-update-b4wdv" event={"ID":"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496","Type":"ContainerStarted","Data":"bf0bf6219a3827f9978a4d657d9f0c8902ab0058726263be468bd1610eca879f"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.591033 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.591565 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-htjb7" event={"ID":"7919b4ef-74fc-472e-9a5e-04216cc51ae5","Type":"ContainerDied","Data":"4cd40efc4978929f55aec279245d0d1733c0429e93e1c880bdcbc1f4f5fc9094"} Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.591651 4926 scope.go:117] "RemoveContainer" containerID="bc88a073a21581ac4854225ad8d64ba64f0d8303fd8d746c072ccb3cc21dc625" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.603220 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b886-account-create-update-sdqf6" podStartSLOduration=3.603199459 podStartE2EDuration="3.603199459s" podCreationTimestamp="2026-03-12 18:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:04.594194848 +0000 UTC m=+1164.962821191" watchObservedRunningTime="2026-03-12 18:22:04.603199459 +0000 UTC m=+1164.971825792" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.639167 4926 scope.go:117] "RemoveContainer" containerID="d971505f58d662cfcbae50ecf97fffad61988b03d55fb8228578b1f86349cea3" Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.692117 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:22:04 crc kubenswrapper[4926]: I0312 18:22:04.698318 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-htjb7"] Mar 12 18:22:05 crc kubenswrapper[4926]: I0312 18:22:05.602534 4926 generic.go:334] "Generic (PLEG): container finished" podID="3f98d642-e5f2-44de-9259-15b5eed6b80c" containerID="2ba173648181598e4717e6ee2a92ecb7ee6351d447d5a38a4302f5af077322b5" exitCode=0 Mar 12 18:22:05 crc kubenswrapper[4926]: I0312 18:22:05.602605 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b886-account-create-update-sdqf6" event={"ID":"3f98d642-e5f2-44de-9259-15b5eed6b80c","Type":"ContainerDied","Data":"2ba173648181598e4717e6ee2a92ecb7ee6351d447d5a38a4302f5af077322b5"} Mar 12 18:22:05 crc kubenswrapper[4926]: I0312 18:22:05.604157 4926 generic.go:334] "Generic (PLEG): container finished" podID="3def1951-b6f9-4621-8428-b3e169e34279" containerID="6e11e8ecbd719cce51e25090959e2c12962023683d089714c24c4c379fbeaa75" exitCode=0 Mar 12 18:22:05 crc kubenswrapper[4926]: I0312 18:22:05.604607 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" event={"ID":"3def1951-b6f9-4621-8428-b3e169e34279","Type":"ContainerDied","Data":"6e11e8ecbd719cce51e25090959e2c12962023683d089714c24c4c379fbeaa75"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.122006 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gk626" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.266647 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.277642 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.291063 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts\") pod \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.291166 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k276h\" (UniqueName: \"kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h\") pod \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\" (UID: \"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.291895 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" (UID: "51bf320d-e5c9-43c6-baf7-e1f2f9ee3313"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.305344 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h" (OuterVolumeSpecName: "kube-api-access-k276h") pod "51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" (UID: "51bf320d-e5c9-43c6-baf7-e1f2f9ee3313"). InnerVolumeSpecName "kube-api-access-k276h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.311432 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kslsb" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.347317 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-87jsw" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.392338 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts\") pod \"1e559975-5aca-457e-8c50-465552595381\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.392481 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrl49\" (UniqueName: \"kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49\") pod \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.392527 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjz72\" (UniqueName: \"kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72\") pod \"1e559975-5aca-457e-8c50-465552595381\" (UID: \"1e559975-5aca-457e-8c50-465552595381\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.392630 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts\") pod \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\" (UID: \"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.392795 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e559975-5aca-457e-8c50-465552595381" (UID: "1e559975-5aca-457e-8c50-465552595381"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.393176 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k276h\" (UniqueName: \"kubernetes.io/projected/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-kube-api-access-k276h\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.393201 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e559975-5aca-457e-8c50-465552595381-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.393213 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.393682 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" (UID: "2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.399991 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49" (OuterVolumeSpecName: "kube-api-access-jrl49") pod "2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" (UID: "2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496"). InnerVolumeSpecName "kube-api-access-jrl49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.400102 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72" (OuterVolumeSpecName: "kube-api-access-mjz72") pod "1e559975-5aca-457e-8c50-465552595381" (UID: "1e559975-5aca-457e-8c50-465552595381"). InnerVolumeSpecName "kube-api-access-mjz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.494818 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26s4h\" (UniqueName: \"kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h\") pod \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.494919 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts\") pod \"eb8a8892-82ee-4502-b76d-ca289485809b\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495175 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts\") pod \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\" (UID: \"ce8493fd-3e35-41fb-8daa-febd2238ce1b\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495220 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2gfv\" (UniqueName: \"kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv\") pod \"eb8a8892-82ee-4502-b76d-ca289485809b\" (UID: \"eb8a8892-82ee-4502-b76d-ca289485809b\") " Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495682 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce8493fd-3e35-41fb-8daa-febd2238ce1b" (UID: "ce8493fd-3e35-41fb-8daa-febd2238ce1b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495912 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrl49\" (UniqueName: \"kubernetes.io/projected/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-kube-api-access-jrl49\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495932 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjz72\" (UniqueName: \"kubernetes.io/projected/1e559975-5aca-457e-8c50-465552595381-kube-api-access-mjz72\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495944 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce8493fd-3e35-41fb-8daa-febd2238ce1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.495959 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.496581 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb8a8892-82ee-4502-b76d-ca289485809b" (UID: "eb8a8892-82ee-4502-b76d-ca289485809b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.498490 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h" (OuterVolumeSpecName: "kube-api-access-26s4h") pod "ce8493fd-3e35-41fb-8daa-febd2238ce1b" (UID: "ce8493fd-3e35-41fb-8daa-febd2238ce1b"). InnerVolumeSpecName "kube-api-access-26s4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.498559 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv" (OuterVolumeSpecName: "kube-api-access-v2gfv") pod "eb8a8892-82ee-4502-b76d-ca289485809b" (UID: "eb8a8892-82ee-4502-b76d-ca289485809b"). InnerVolumeSpecName "kube-api-access-v2gfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.504475 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" path="/var/lib/kubelet/pods/7919b4ef-74fc-472e-9a5e-04216cc51ae5/volumes" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.597091 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2gfv\" (UniqueName: \"kubernetes.io/projected/eb8a8892-82ee-4502-b76d-ca289485809b-kube-api-access-v2gfv\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.597135 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26s4h\" (UniqueName: \"kubernetes.io/projected/ce8493fd-3e35-41fb-8daa-febd2238ce1b-kube-api-access-26s4h\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.597150 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb8a8892-82ee-4502-b76d-ca289485809b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.612669 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-87jsw" event={"ID":"ce8493fd-3e35-41fb-8daa-febd2238ce1b","Type":"ContainerDied","Data":"2fffaad6d5a06f655ff4c8d188a160c681cfc6e69dc4ef7ffcfa90f4d6f269b4"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.612692 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-87jsw" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.612714 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fffaad6d5a06f655ff4c8d188a160c681cfc6e69dc4ef7ffcfa90f4d6f269b4" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.616257 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c596-account-create-update-kw2ql" event={"ID":"1e559975-5aca-457e-8c50-465552595381","Type":"ContainerDied","Data":"12f3735d737a130d7a9095b30d8233ba764f6de01a5c6fe30de6758919c90b92"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.616426 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12f3735d737a130d7a9095b30d8233ba764f6de01a5c6fe30de6758919c90b92" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.616352 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c596-account-create-update-kw2ql" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.617686 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kslsb" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.617703 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kslsb" event={"ID":"eb8a8892-82ee-4502-b76d-ca289485809b","Type":"ContainerDied","Data":"00b76c7be6d0fdb000f31c1e38cf50ca0b2d1f46c57eb45e800949b818b564a5"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.617747 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b76c7be6d0fdb000f31c1e38cf50ca0b2d1f46c57eb45e800949b818b564a5" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.619975 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gk626" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.620382 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gk626" event={"ID":"51bf320d-e5c9-43c6-baf7-e1f2f9ee3313","Type":"ContainerDied","Data":"2dfd9c4d7dc7e12afe66ccf426b7bea55f65eaade1defa6244ce76ada3b5ba5c"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.620742 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dfd9c4d7dc7e12afe66ccf426b7bea55f65eaade1defa6244ce76ada3b5ba5c" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.621512 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-15a0-account-create-update-b4wdv" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.622515 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-15a0-account-create-update-b4wdv" event={"ID":"2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496","Type":"ContainerDied","Data":"bf0bf6219a3827f9978a4d657d9f0c8902ab0058726263be468bd1610eca879f"} Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.622825 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0bf6219a3827f9978a4d657d9f0c8902ab0058726263be468bd1610eca879f" Mar 12 18:22:06 crc kubenswrapper[4926]: I0312 18:22:06.966554 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.104408 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9pnj\" (UniqueName: \"kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj\") pod \"3def1951-b6f9-4621-8428-b3e169e34279\" (UID: \"3def1951-b6f9-4621-8428-b3e169e34279\") " Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.108632 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj" (OuterVolumeSpecName: "kube-api-access-l9pnj") pod "3def1951-b6f9-4621-8428-b3e169e34279" (UID: "3def1951-b6f9-4621-8428-b3e169e34279"). InnerVolumeSpecName "kube-api-access-l9pnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.114049 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.206092 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9pnj\" (UniqueName: \"kubernetes.io/projected/3def1951-b6f9-4621-8428-b3e169e34279-kube-api-access-l9pnj\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.307352 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8brwx\" (UniqueName: \"kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx\") pod \"3f98d642-e5f2-44de-9259-15b5eed6b80c\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.307743 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts\") pod \"3f98d642-e5f2-44de-9259-15b5eed6b80c\" (UID: \"3f98d642-e5f2-44de-9259-15b5eed6b80c\") " Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.308395 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f98d642-e5f2-44de-9259-15b5eed6b80c" (UID: "3f98d642-e5f2-44de-9259-15b5eed6b80c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.311906 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx" (OuterVolumeSpecName: "kube-api-access-8brwx") pod "3f98d642-e5f2-44de-9259-15b5eed6b80c" (UID: "3f98d642-e5f2-44de-9259-15b5eed6b80c"). InnerVolumeSpecName "kube-api-access-8brwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.318916 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bgd7w"] Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319423 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="dnsmasq-dns" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319479 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="dnsmasq-dns" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319508 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8a8892-82ee-4502-b76d-ca289485809b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319520 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8a8892-82ee-4502-b76d-ca289485809b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319545 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f98d642-e5f2-44de-9259-15b5eed6b80c" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319558 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f98d642-e5f2-44de-9259-15b5eed6b80c" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319600 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e559975-5aca-457e-8c50-465552595381" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319612 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e559975-5aca-457e-8c50-465552595381" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319632 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8493fd-3e35-41fb-8daa-febd2238ce1b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319644 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8493fd-3e35-41fb-8daa-febd2238ce1b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319664 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="init" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319676 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="init" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319694 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3def1951-b6f9-4621-8428-b3e169e34279" containerName="oc" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319706 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="3def1951-b6f9-4621-8428-b3e169e34279" containerName="oc" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319733 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319745 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: E0312 18:22:07.319763 4926 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.319776 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320064 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7919b4ef-74fc-472e-9a5e-04216cc51ae5" containerName="dnsmasq-dns" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320091 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="3def1951-b6f9-4621-8428-b3e169e34279" containerName="oc" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320112 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f98d642-e5f2-44de-9259-15b5eed6b80c" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320132 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e559975-5aca-457e-8c50-465552595381" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320146 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" containerName="mariadb-account-create-update" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320160 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320182 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8493fd-3e35-41fb-8daa-febd2238ce1b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.320203 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8a8892-82ee-4502-b76d-ca289485809b" containerName="mariadb-database-create" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.321008 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.327497 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.335166 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bgd7w"] Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.410320 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8brwx\" (UniqueName: \"kubernetes.io/projected/3f98d642-e5f2-44de-9259-15b5eed6b80c-kube-api-access-8brwx\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.410365 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f98d642-e5f2-44de-9259-15b5eed6b80c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.512235 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52cf\" (UniqueName: \"kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.512302 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.614729 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x52cf\" (UniqueName: \"kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.614859 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.617766 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.659385 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" event={"ID":"3def1951-b6f9-4621-8428-b3e169e34279","Type":"ContainerDied","Data":"9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c"} Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.659473 4926 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9995b1f578722c4da205a824f7fc383db43c0b559444cc089533ad24be578f7c" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.659550 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555662-wmwgq" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.675168 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52cf\" (UniqueName: \"kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf\") pod \"root-account-create-update-bgd7w\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.677375 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b886-account-create-update-sdqf6" event={"ID":"3f98d642-e5f2-44de-9259-15b5eed6b80c","Type":"ContainerDied","Data":"2b00f6a8874e175c9a39535d556a6fbc2f2a3ee6efb123f9a4087969dcab6675"} Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.677431 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b00f6a8874e175c9a39535d556a6fbc2f2a3ee6efb123f9a4087969dcab6675" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.677527 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b886-account-create-update-sdqf6" Mar 12 18:22:07 crc kubenswrapper[4926]: I0312 18:22:07.682169 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.063295 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555656-zlrgw"] Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.071423 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555656-zlrgw"] Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.171941 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bgd7w"] Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.507914 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b042fb81-959e-48c0-8a9e-87bafcad2fe3" path="/var/lib/kubelet/pods/b042fb81-959e-48c0-8a9e-87bafcad2fe3/volumes" Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.688146 4926 generic.go:334] "Generic (PLEG): container finished" podID="88fe4c28-3abc-43de-8686-83c78b0633f8" containerID="bf4088b7010b072d07c97127da27603d49f76d54fb965fcf30cdce478bed5715" exitCode=0 Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.688187 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bgd7w" event={"ID":"88fe4c28-3abc-43de-8686-83c78b0633f8","Type":"ContainerDied","Data":"bf4088b7010b072d07c97127da27603d49f76d54fb965fcf30cdce478bed5715"} Mar 12 18:22:08 crc kubenswrapper[4926]: I0312 18:22:08.688212 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bgd7w" event={"ID":"88fe4c28-3abc-43de-8686-83c78b0633f8","Type":"ContainerStarted","Data":"ff6a1dc4d902432ca748504cde2a8c61dd124de912bd1cf55d1e3457e983d254"} Mar 12 18:22:09 crc kubenswrapper[4926]: I0312 18:22:09.853537 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: 
\"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0" Mar 12 18:22:09 crc kubenswrapper[4926]: E0312 18:22:09.853842 4926 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 18:22:09 crc kubenswrapper[4926]: E0312 18:22:09.853874 4926 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 18:22:09 crc kubenswrapper[4926]: E0312 18:22:09.854036 4926 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift podName:57853681-32de-4475-9c7d-3f9708fe7d91 nodeName:}" failed. No retries permitted until 2026-03-12 18:22:25.854009492 +0000 UTC m=+1186.222635825 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift") pod "swift-storage-0" (UID: "57853681-32de-4475-9c7d-3f9708fe7d91") : configmap "swift-ring-files" not found Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.038959 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.159292 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x52cf\" (UniqueName: \"kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf\") pod \"88fe4c28-3abc-43de-8686-83c78b0633f8\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.159647 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts\") pod \"88fe4c28-3abc-43de-8686-83c78b0633f8\" (UID: \"88fe4c28-3abc-43de-8686-83c78b0633f8\") " Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.160226 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88fe4c28-3abc-43de-8686-83c78b0633f8" (UID: "88fe4c28-3abc-43de-8686-83c78b0633f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.166713 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf" (OuterVolumeSpecName: "kube-api-access-x52cf") pod "88fe4c28-3abc-43de-8686-83c78b0633f8" (UID: "88fe4c28-3abc-43de-8686-83c78b0633f8"). InnerVolumeSpecName "kube-api-access-x52cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.261700 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x52cf\" (UniqueName: \"kubernetes.io/projected/88fe4c28-3abc-43de-8686-83c78b0633f8-kube-api-access-x52cf\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.261726 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88fe4c28-3abc-43de-8686-83c78b0633f8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.711467 4926 generic.go:334] "Generic (PLEG): container finished" podID="9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" containerID="251cc81f683a75e3d21cb79796d4e3d3d467bddd70539c194beb7a011504559b" exitCode=0 Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.711559 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bn6gf" event={"ID":"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f","Type":"ContainerDied","Data":"251cc81f683a75e3d21cb79796d4e3d3d467bddd70539c194beb7a011504559b"} Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.714143 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bgd7w" event={"ID":"88fe4c28-3abc-43de-8686-83c78b0633f8","Type":"ContainerDied","Data":"ff6a1dc4d902432ca748504cde2a8c61dd124de912bd1cf55d1e3457e983d254"} Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.714200 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff6a1dc4d902432ca748504cde2a8c61dd124de912bd1cf55d1e3457e983d254" Mar 12 18:22:10 crc kubenswrapper[4926]: I0312 18:22:10.714522 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bgd7w" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.193914 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-72msf"] Mar 12 18:22:11 crc kubenswrapper[4926]: E0312 18:22:11.194198 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88fe4c28-3abc-43de-8686-83c78b0633f8" containerName="mariadb-account-create-update" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.194210 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="88fe4c28-3abc-43de-8686-83c78b0633f8" containerName="mariadb-account-create-update" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.194357 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="88fe4c28-3abc-43de-8686-83c78b0633f8" containerName="mariadb-account-create-update" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.194848 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.198082 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2zfbp" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.204953 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.207668 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-72msf"] Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.280760 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.280829 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.280880 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9kjm\" (UniqueName: \"kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.280911 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.382829 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9kjm\" (UniqueName: \"kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.382896 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.383049 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.383095 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle\") pod 
\"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.386655 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.388009 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.392087 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.409016 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9kjm\" (UniqueName: \"kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm\") pod \"glance-db-sync-72msf\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " pod="openstack/glance-db-sync-72msf" Mar 12 18:22:11 crc kubenswrapper[4926]: I0312 18:22:11.518198 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-72msf" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.040688 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bn6gf" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.072559 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-72msf"] Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.199887 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bg7n\" (UniqueName: \"kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.199977 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.200069 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.200114 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.200208 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.200255 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.200318 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts\") pod \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\" (UID: \"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f\") " Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.202121 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.202237 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.206726 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n" (OuterVolumeSpecName: "kube-api-access-6bg7n") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "kube-api-access-6bg7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.212152 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.223007 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts" (OuterVolumeSpecName: "scripts") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.229716 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.236903 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" (UID: "9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302533 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302569 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bg7n\" (UniqueName: \"kubernetes.io/projected/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-kube-api-access-6bg7n\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302580 4926 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302590 4926 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302599 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302607 4926 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.302618 4926 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.730034 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-bn6gf" event={"ID":"9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f","Type":"ContainerDied","Data":"beae81a2d2cc5a26e427094e9b94a43594531b558df0f2e26a7a5160b7c22dda"} Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.730085 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beae81a2d2cc5a26e427094e9b94a43594531b558df0f2e26a7a5160b7c22dda" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.730094 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bn6gf" Mar 12 18:22:12 crc kubenswrapper[4926]: I0312 18:22:12.730991 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-72msf" event={"ID":"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8","Type":"ContainerStarted","Data":"0b8074caa3b0c10eb9a072ab0513f4ac17eb3bec44cc3edb55302dc941d344c0"} Mar 12 18:22:13 crc kubenswrapper[4926]: I0312 18:22:13.709000 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bgd7w"] Mar 12 18:22:13 crc kubenswrapper[4926]: I0312 18:22:13.715672 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bgd7w"] Mar 12 18:22:14 crc kubenswrapper[4926]: I0312 18:22:14.047201 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 12 18:22:14 crc kubenswrapper[4926]: I0312 18:22:14.499062 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88fe4c28-3abc-43de-8686-83c78b0633f8" path="/var/lib/kubelet/pods/88fe4c28-3abc-43de-8686-83c78b0633f8/volumes" Mar 12 18:22:15 crc kubenswrapper[4926]: I0312 18:22:15.768569 4926 generic.go:334] "Generic (PLEG): container finished" podID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerID="68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5" exitCode=0 Mar 12 18:22:15 crc kubenswrapper[4926]: I0312 18:22:15.768970 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerDied","Data":"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5"} Mar 12 18:22:15 crc kubenswrapper[4926]: I0312 18:22:15.773336 4926 generic.go:334] "Generic (PLEG): container finished" podID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerID="cfa97fbf75b6c2e852f5d66a227450ea6c8da1b0641deb3da6fb3c35ba9a0f12" exitCode=0 Mar 12 18:22:15 crc kubenswrapper[4926]: I0312 18:22:15.773369 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerDied","Data":"cfa97fbf75b6c2e852f5d66a227450ea6c8da1b0641deb3da6fb3c35ba9a0f12"} Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.258026 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.269587 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-6znbv" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.547685 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sfwpr-config-m7tgt"] Mar 12 18:22:16 crc kubenswrapper[4926]: E0312 18:22:16.548880 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" containerName="swift-ring-rebalance" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.548904 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" containerName="swift-ring-rebalance" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.549772 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f" containerName="swift-ring-rebalance" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.551371 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr-config-m7tgt"] 
Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.551637 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.554872 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.702754 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.703027 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9bbf\" (UniqueName: \"kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.703155 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.703271 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.703393 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.703634 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806179 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806257 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9bbf\" (UniqueName: \"kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf\") pod \"ovn-controller-sfwpr-config-m7tgt\" 
(UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806284 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806322 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806362 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806406 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.806722 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.808064 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.808397 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.808588 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.808604 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " 
pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.809109 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerStarted","Data":"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a"} Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.810181 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.819407 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerStarted","Data":"3e4a5dd026300b0565c4a11203b217973524caa9bcc9839e71d49583b866e88e"} Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.820345 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.854432 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9bbf\" (UniqueName: \"kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf\") pod \"ovn-controller-sfwpr-config-m7tgt\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.862144 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.244234415 podStartE2EDuration="1m0.862124989s" podCreationTimestamp="2026-03-12 18:21:16 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.099916532 +0000 UTC m=+1131.468542865" lastFinishedPulling="2026-03-12 18:21:41.717807106 +0000 UTC m=+1142.086433439" observedRunningTime="2026-03-12 18:22:16.847796143 +0000 UTC m=+1177.216422476" watchObservedRunningTime="2026-03-12 18:22:16.862124989 +0000 UTC m=+1177.230751322" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.879432 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:16 crc kubenswrapper[4926]: I0312 18:22:16.889804 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.307808794 podStartE2EDuration="1m0.889749979s" podCreationTimestamp="2026-03-12 18:21:16 +0000 UTC" firstStartedPulling="2026-03-12 18:21:31.325478209 +0000 UTC m=+1131.694104542" lastFinishedPulling="2026-03-12 18:21:40.907419394 +0000 UTC m=+1141.276045727" observedRunningTime="2026-03-12 18:22:16.885823736 +0000 UTC m=+1177.254450069" watchObservedRunningTime="2026-03-12 18:22:16.889749979 +0000 UTC m=+1177.258376312" Mar 12 18:22:17 crc kubenswrapper[4926]: I0312 18:22:17.430465 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr-config-m7tgt"] Mar 12 18:22:17 crc kubenswrapper[4926]: I0312 18:22:17.826763 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-m7tgt" event={"ID":"eab8aa6f-7146-49ab-bedf-e7dae33f96bf","Type":"ContainerStarted","Data":"f80e900658b9e00858dd34680664f8b822bb4e0eba4897392ac1a1da7732a56c"} Mar 12 18:22:17 crc kubenswrapper[4926]: I0312 18:22:17.827025 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-m7tgt" event={"ID":"eab8aa6f-7146-49ab-bedf-e7dae33f96bf","Type":"ContainerStarted","Data":"63ca14d801a286c7a6026983b605cad8b07e6eedc355463549b31b37de421eb4"} Mar 12 18:22:17 crc kubenswrapper[4926]: I0312 18:22:17.842555 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sfwpr-config-m7tgt" podStartSLOduration=1.842538472 podStartE2EDuration="1.842538472s" podCreationTimestamp="2026-03-12 18:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:17.842175501 +0000 UTC m=+1178.210801834" watchObservedRunningTime="2026-03-12 18:22:17.842538472 +0000 UTC m=+1178.211164805" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.716288 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jlt78"] Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.717742 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.721895 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.728863 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlt78"] Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.839696 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvnd6\" (UniqueName: \"kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.839747 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.851800 4926 generic.go:334] "Generic (PLEG): container finished" podID="eab8aa6f-7146-49ab-bedf-e7dae33f96bf" containerID="f80e900658b9e00858dd34680664f8b822bb4e0eba4897392ac1a1da7732a56c" exitCode=0 Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.851838 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-m7tgt" event={"ID":"eab8aa6f-7146-49ab-bedf-e7dae33f96bf","Type":"ContainerDied","Data":"f80e900658b9e00858dd34680664f8b822bb4e0eba4897392ac1a1da7732a56c"} Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.941148 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvnd6\" (UniqueName: \"kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.941230 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.942083 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:18 crc kubenswrapper[4926]: I0312 18:22:18.969478 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvnd6\" (UniqueName: \"kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6\") pod \"root-account-create-update-jlt78\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:19 crc kubenswrapper[4926]: I0312 18:22:19.034129 4926 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:25 crc kubenswrapper[4926]: I0312 18:22:25.867405 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0" Mar 12 18:22:26 crc kubenswrapper[4926]: I0312 18:22:25.876342 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57853681-32de-4475-9c7d-3f9708fe7d91-etc-swift\") pod \"swift-storage-0\" (UID: \"57853681-32de-4475-9c7d-3f9708fe7d91\") " pod="openstack/swift-storage-0" Mar 12 18:22:26 crc kubenswrapper[4926]: I0312 18:22:26.011875 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 12 18:22:26 crc kubenswrapper[4926]: I0312 18:22:26.230131 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sfwpr" Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.506717 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.771628 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.965498 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.976257 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-m7tgt" event={"ID":"eab8aa6f-7146-49ab-bedf-e7dae33f96bf","Type":"ContainerDied","Data":"63ca14d801a286c7a6026983b605cad8b07e6eedc355463549b31b37de421eb4"} Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.976521 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ca14d801a286c7a6026983b605cad8b07e6eedc355463549b31b37de421eb4" Mar 12 18:22:27 crc kubenswrapper[4926]: I0312 18:22:27.976404 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-m7tgt" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004645 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9bbf\" (UniqueName: \"kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004753 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004797 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004860 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004890 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004914 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004939 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts\") pod \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\" (UID: \"eab8aa6f-7146-49ab-bedf-e7dae33f96bf\") " Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.004965 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005028 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run" (OuterVolumeSpecName: "var-run") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005515 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005621 4926 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005635 4926 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005646 4926 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-var-run\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005657 4926 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.005818 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts" (OuterVolumeSpecName: "scripts") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.013507 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf" (OuterVolumeSpecName: "kube-api-access-v9bbf") pod "eab8aa6f-7146-49ab-bedf-e7dae33f96bf" (UID: "eab8aa6f-7146-49ab-bedf-e7dae33f96bf"). InnerVolumeSpecName "kube-api-access-v9bbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.107523 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9bbf\" (UniqueName: \"kubernetes.io/projected/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-kube-api-access-v9bbf\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.107559 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eab8aa6f-7146-49ab-bedf-e7dae33f96bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.313769 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jlt78"] Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.502267 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 12 18:22:28 crc kubenswrapper[4926]: W0312 18:22:28.505300 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57853681_32de_4475_9c7d_3f9708fe7d91.slice/crio-d44e9a8ee5ceb248e04686de962e1c80c3449a6c3bd013213aff0a7fef9a9e4c WatchSource:0}: Error finding container d44e9a8ee5ceb248e04686de962e1c80c3449a6c3bd013213aff0a7fef9a9e4c: Status 404 returned error can't find the container with id d44e9a8ee5ceb248e04686de962e1c80c3449a6c3bd013213aff0a7fef9a9e4c Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.984072 4926 generic.go:334] "Generic (PLEG): container finished" podID="349ca4f5-349b-45ab-98a4-844fa00599d0" containerID="c3e155687eb67c8841a428e422ae982d9a3f9a577c50dd1606a11fd1ad93242c" exitCode=0 Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.984141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlt78" event={"ID":"349ca4f5-349b-45ab-98a4-844fa00599d0","Type":"ContainerDied","Data":"c3e155687eb67c8841a428e422ae982d9a3f9a577c50dd1606a11fd1ad93242c"} Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.984164 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlt78" event={"ID":"349ca4f5-349b-45ab-98a4-844fa00599d0","Type":"ContainerStarted","Data":"5ee494a44c685214964380e7c1b459b8f01159d69ae5d0dd6ee963a73f27e878"} Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.985369 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-72msf" event={"ID":"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8","Type":"ContainerStarted","Data":"f8a8e8e5fcda402042df2295b1c2cbd321aa245d9b7be946af5559093b56242a"} Mar 12 18:22:28 crc kubenswrapper[4926]: I0312 18:22:28.986236 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"d44e9a8ee5ceb248e04686de962e1c80c3449a6c3bd013213aff0a7fef9a9e4c"} Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.031908 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-72msf" podStartSLOduration=2.241839521 podStartE2EDuration="18.031883259s" podCreationTimestamp="2026-03-12 18:22:11 +0000 UTC" firstStartedPulling="2026-03-12 18:22:12.075050894 +0000 UTC m=+1172.443677247" lastFinishedPulling="2026-03-12 18:22:27.865094652 +0000 UTC m=+1188.233720985" observedRunningTime="2026-03-12 18:22:29.025243513 +0000 UTC m=+1189.393869856" watchObservedRunningTime="2026-03-12 18:22:29.031883259 +0000 UTC 
m=+1189.400509592" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.098576 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sfwpr-config-m7tgt"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.104225 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sfwpr-config-m7tgt"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.201504 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sfwpr-config-55drx"] Mar 12 18:22:29 crc kubenswrapper[4926]: E0312 18:22:29.202263 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab8aa6f-7146-49ab-bedf-e7dae33f96bf" containerName="ovn-config" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.202276 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab8aa6f-7146-49ab-bedf-e7dae33f96bf" containerName="ovn-config" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.202451 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab8aa6f-7146-49ab-bedf-e7dae33f96bf" containerName="ovn-config" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.202961 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.204913 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.208975 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr-config-55drx"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.330779 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.330839 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.330918 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drskx\" (UniqueName: \"kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.331014 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.331164 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.331297 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.405465 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-n9jj2"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.406501 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.425837 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-n9jj2"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433164 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433212 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433263 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drskx\" (UniqueName: \"kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433283 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433321 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433357 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 
18:22:29.433642 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433645 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.433661 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.434525 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.439354 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.498663 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drskx\" (UniqueName: \"kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx\") pod \"ovn-controller-sfwpr-config-55drx\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.509773 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-40f3-account-create-update-6f5kz"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.510886 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.513620 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.519095 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-40f3-account-create-update-6f5kz"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.528102 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.534585 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rddff\" (UniqueName: \"kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.534626 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.595364 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-n28hs"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.599699 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.608340 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-n28hs"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.635623 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rddff\" (UniqueName: \"kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.635664 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.635712 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pb5c\" (UniqueName: \"kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.635758 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.636769 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.672082 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rddff\" (UniqueName: \"kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff\") pod \"cinder-db-create-n9jj2\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.714986 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-bf50-account-create-update-ckzh6"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.716184 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.722114 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.723571 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.737149 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts\") pod \"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.737238 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pb5c\" (UniqueName: \"kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.737275 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7tgz\" (UniqueName: \"kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz\") pod \"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.737314 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.738100 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.744068 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bf50-account-create-update-ckzh6"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.763288 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pb5c\" (UniqueName: \"kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c\") pod \"cinder-40f3-account-create-update-6f5kz\" (UID: 
\"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.789013 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-4vtkq"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.789985 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.794687 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.794875 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.795428 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.795636 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vrsc7" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.825927 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qr6zk"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.826857 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.834368 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.838804 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nh4b\" (UniqueName: \"kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.838921 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7tgz\" (UniqueName: \"kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz\") pod \"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.838972 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.839008 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.839069 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts\") pod 
\"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.839101 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8k85\" (UniqueName: \"kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.839120 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.840255 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts\") pod \"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.858511 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr6zk"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.869693 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2c76-account-create-update-wvgzd"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.870682 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.872512 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.879904 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7tgz\" (UniqueName: \"kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz\") pod \"barbican-db-create-n28hs\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.892496 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2c76-account-create-update-wvgzd"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.907880 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4vtkq"] Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.935512 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940308 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8k85\" (UniqueName: \"kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940353 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940392 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nh4b\" (UniqueName: \"kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940427 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jl7q\" (UniqueName: \"kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940502 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940559 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940585 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940615 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.940648 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r6dd\" (UniqueName: 
\"kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.941307 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.945731 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.951080 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.964959 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8k85\" (UniqueName: \"kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85\") pod \"keystone-db-sync-4vtkq\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:29 crc kubenswrapper[4926]: I0312 18:22:29.982175 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nh4b\" (UniqueName: \"kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b\") pod \"barbican-bf50-account-create-update-ckzh6\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.037059 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.042491 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r6dd\" (UniqueName: \"kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.042668 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jl7q\" (UniqueName: \"kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.042727 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.042790 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.043677 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.043691 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.070974 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jl7q\" (UniqueName: \"kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q\") pod \"neutron-db-create-qr6zk\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.070995 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r6dd\" (UniqueName: \"kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd\") pod \"neutron-2c76-account-create-update-wvgzd\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.150511 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sfwpr-config-55drx"] Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.250182 4926 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.266677 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.275991 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.316767 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-40f3-account-create-update-6f5kz"] Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.432909 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-n9jj2"] Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.500793 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab8aa6f-7146-49ab-bedf-e7dae33f96bf" path="/var/lib/kubelet/pods/eab8aa6f-7146-49ab-bedf-e7dae33f96bf/volumes" Mar 12 18:22:30 crc kubenswrapper[4926]: W0312 18:22:30.793903 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1a6049_a924_4e9e_adad_e6ec84732eb9.slice/crio-7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013 WatchSource:0}: Error finding container 7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013: Status 404 returned error can't find the container with id 7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013 Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.855314 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.958575 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts\") pod \"349ca4f5-349b-45ab-98a4-844fa00599d0\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.958803 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvnd6\" (UniqueName: \"kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6\") pod \"349ca4f5-349b-45ab-98a4-844fa00599d0\" (UID: \"349ca4f5-349b-45ab-98a4-844fa00599d0\") " Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.962575 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "349ca4f5-349b-45ab-98a4-844fa00599d0" (UID: "349ca4f5-349b-45ab-98a4-844fa00599d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:30 crc kubenswrapper[4926]: I0312 18:22:30.968510 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6" (OuterVolumeSpecName: "kube-api-access-tvnd6") pod "349ca4f5-349b-45ab-98a4-844fa00599d0" (UID: "349ca4f5-349b-45ab-98a4-844fa00599d0"). InnerVolumeSpecName "kube-api-access-tvnd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.066548 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/349ca4f5-349b-45ab-98a4-844fa00599d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.066875 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvnd6\" (UniqueName: \"kubernetes.io/projected/349ca4f5-349b-45ab-98a4-844fa00599d0-kube-api-access-tvnd6\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.083020 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jlt78" Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.084263 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jlt78" event={"ID":"349ca4f5-349b-45ab-98a4-844fa00599d0","Type":"ContainerDied","Data":"5ee494a44c685214964380e7c1b459b8f01159d69ae5d0dd6ee963a73f27e878"} Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.084306 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee494a44c685214964380e7c1b459b8f01159d69ae5d0dd6ee963a73f27e878" Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.087952 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-40f3-account-create-update-6f5kz" event={"ID":"6a1a6049-a924-4e9e-adad-e6ec84732eb9","Type":"ContainerStarted","Data":"7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013"} Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.098682 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n9jj2" event={"ID":"fa60a9bd-1933-44b8-ac17-be1f72c9da68","Type":"ContainerStarted","Data":"2b292237feadafee95b31e06cfc840f72c515e88b8454c8337e1e74e28cd7576"} Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.133384 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-55drx" event={"ID":"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d","Type":"ContainerStarted","Data":"1b64556f5bf261422bd7a946c6906ac510a7650f9110c56570560ce6cb0ceff7"} Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.531119 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2c76-account-create-update-wvgzd"] Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.538400 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-n28hs"] Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.615644 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-bf50-account-create-update-ckzh6"] Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.626821 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr6zk"] Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.719903 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-4vtkq"] Mar 12 18:22:31 crc kubenswrapper[4926]: W0312 18:22:31.730812 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea314c69_7524_43d6_9d4f_9fdb16510952.slice/crio-c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5 WatchSource:0}: Error finding container c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5: Status 404 
returned error can't find the container with id c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5 Mar 12 18:22:31 crc kubenswrapper[4926]: W0312 18:22:31.733915 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e5bf33c_5cb6_4b29_95f4_fcf47183be58.slice/crio-c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d WatchSource:0}: Error finding container c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d: Status 404 returned error can't find the container with id c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d Mar 12 18:22:31 crc kubenswrapper[4926]: I0312 18:22:31.739711 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.141052 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vtkq" event={"ID":"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d","Type":"ContainerStarted","Data":"7c7e1f1cb85b710df8d9fc682c9ab5a089b0604e2b56bc8fd5036a48bcce3088"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.142844 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2c76-account-create-update-wvgzd" event={"ID":"f7917312-5922-44df-a838-6b5452e8bb84","Type":"ContainerStarted","Data":"1a95a2291b71efd71b4d69e1bff55bb10de0a0cbc11d599a6fbdc537cf2a1c21"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.142876 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2c76-account-create-update-wvgzd" event={"ID":"f7917312-5922-44df-a838-6b5452e8bb84","Type":"ContainerStarted","Data":"63fccee8b612554f7b094c76fecdac4ac35f19b95dc6b9a6784a3936c615d9a5"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.144358 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf50-account-create-update-ckzh6" event={"ID":"9e5bf33c-5cb6-4b29-95f4-fcf47183be58","Type":"ContainerStarted","Data":"ff84043d3f919db7adaecbb9d5f2f9199eabe47965d6033ec791be724e00a8f2"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.144380 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf50-account-create-update-ckzh6" event={"ID":"9e5bf33c-5cb6-4b29-95f4-fcf47183be58","Type":"ContainerStarted","Data":"c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.146013 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6zk" event={"ID":"ea314c69-7524-43d6-9d4f-9fdb16510952","Type":"ContainerStarted","Data":"7a2cc30c7cfe8ca6acf3b5029da578678f6497bf01186f5ff5a4caf50683498c"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.146056 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6zk" event={"ID":"ea314c69-7524-43d6-9d4f-9fdb16510952","Type":"ContainerStarted","Data":"c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.147580 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n28hs" event={"ID":"05959443-d099-4653-9736-7745ba1ce331","Type":"ContainerStarted","Data":"f7b35eaf50020882859b6b648c035a75d02c47868f155d967f93cf393c62f39e"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.147615 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n28hs" 
event={"ID":"05959443-d099-4653-9736-7745ba1ce331","Type":"ContainerStarted","Data":"def54e7f3bca68561150fcadb909db84a99aaf5cfe2458f9499bbf1014d7ccf8"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.149173 4926 generic.go:334] "Generic (PLEG): container finished" podID="fa60a9bd-1933-44b8-ac17-be1f72c9da68" containerID="12643d715a7a25afe43a9b8c6f2f23a51e1343ba5e8cfc3caebf03243d5ac109" exitCode=0 Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.149239 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n9jj2" event={"ID":"fa60a9bd-1933-44b8-ac17-be1f72c9da68","Type":"ContainerDied","Data":"12643d715a7a25afe43a9b8c6f2f23a51e1343ba5e8cfc3caebf03243d5ac109"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.150819 4926 generic.go:334] "Generic (PLEG): container finished" podID="1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" containerID="f2d7197b9b3bf88ce380eff3700bdf47487a3209f60cede9380f4a5b74fd2a07" exitCode=0 Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.150858 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-55drx" event={"ID":"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d","Type":"ContainerDied","Data":"f2d7197b9b3bf88ce380eff3700bdf47487a3209f60cede9380f4a5b74fd2a07"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.154076 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"3ed8b79685248ae25cde3e72f762ac91192ada9775d542cdf62b1115d6fa4f4e"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.154106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"872a3c331ab2c6465434c4ee0f90bfa5304ded6c4675c8f9b4a055f0cc249520"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.154116 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"c8d0099df2ef97a7e69d18ad8472352f918822e9cbc54b1cba03b6ac41eaeb75"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.155517 4926 generic.go:334] "Generic (PLEG): container finished" podID="6a1a6049-a924-4e9e-adad-e6ec84732eb9" containerID="c4054fe5394aadede666252204c9b503dc03ddd88691e230f2deda1936fe7bc5" exitCode=0 Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.155557 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-40f3-account-create-update-6f5kz" event={"ID":"6a1a6049-a924-4e9e-adad-e6ec84732eb9","Type":"ContainerDied","Data":"c4054fe5394aadede666252204c9b503dc03ddd88691e230f2deda1936fe7bc5"} Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.158992 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-2c76-account-create-update-wvgzd" podStartSLOduration=3.158975574 podStartE2EDuration="3.158975574s" podCreationTimestamp="2026-03-12 18:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:32.1572663 +0000 UTC m=+1192.525892633" watchObservedRunningTime="2026-03-12 18:22:32.158975574 +0000 UTC m=+1192.527601907" Mar 12 18:22:32 crc kubenswrapper[4926]: I0312 18:22:32.192297 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-n28hs" 
podStartSLOduration=3.19228327 podStartE2EDuration="3.19228327s" podCreationTimestamp="2026-03-12 18:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:32.184318032 +0000 UTC m=+1192.552944365" watchObservedRunningTime="2026-03-12 18:22:32.19228327 +0000 UTC m=+1192.560909603" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.164711 4926 generic.go:334] "Generic (PLEG): container finished" podID="05959443-d099-4653-9736-7745ba1ce331" containerID="f7b35eaf50020882859b6b648c035a75d02c47868f155d967f93cf393c62f39e" exitCode=0 Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.164801 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n28hs" event={"ID":"05959443-d099-4653-9736-7745ba1ce331","Type":"ContainerDied","Data":"f7b35eaf50020882859b6b648c035a75d02c47868f155d967f93cf393c62f39e"} Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.169095 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7917312-5922-44df-a838-6b5452e8bb84" containerID="1a95a2291b71efd71b4d69e1bff55bb10de0a0cbc11d599a6fbdc537cf2a1c21" exitCode=0 Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.169189 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2c76-account-create-update-wvgzd" event={"ID":"f7917312-5922-44df-a838-6b5452e8bb84","Type":"ContainerDied","Data":"1a95a2291b71efd71b4d69e1bff55bb10de0a0cbc11d599a6fbdc537cf2a1c21"} Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.170749 4926 generic.go:334] "Generic (PLEG): container finished" podID="9e5bf33c-5cb6-4b29-95f4-fcf47183be58" containerID="ff84043d3f919db7adaecbb9d5f2f9199eabe47965d6033ec791be724e00a8f2" exitCode=0 Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.170800 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf50-account-create-update-ckzh6" event={"ID":"9e5bf33c-5cb6-4b29-95f4-fcf47183be58","Type":"ContainerDied","Data":"ff84043d3f919db7adaecbb9d5f2f9199eabe47965d6033ec791be724e00a8f2"} Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.173020 4926 generic.go:334] "Generic (PLEG): container finished" podID="ea314c69-7524-43d6-9d4f-9fdb16510952" containerID="7a2cc30c7cfe8ca6acf3b5029da578678f6497bf01186f5ff5a4caf50683498c" exitCode=0 Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.173106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6zk" event={"ID":"ea314c69-7524-43d6-9d4f-9fdb16510952","Type":"ContainerDied","Data":"7a2cc30c7cfe8ca6acf3b5029da578678f6497bf01186f5ff5a4caf50683498c"} Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.176024 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"0468589b613cbf08fc1bf01e9282b6c42aea8979704d41cdc9fd9d076d25db9d"} Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.708264 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.719868 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.740190 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849078 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849183 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pb5c\" (UniqueName: \"kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c\") pod \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849211 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849270 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rddff\" (UniqueName: \"kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff\") pod \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849297 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts\") pod \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\" (UID: \"fa60a9bd-1933-44b8-ac17-be1f72c9da68\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849312 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849386 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drskx\" (UniqueName: \"kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849409 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849426 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn\") pod \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\" (UID: \"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.849467 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts\") pod 
\"6a1a6049-a924-4e9e-adad-e6ec84732eb9\" (UID: \"6a1a6049-a924-4e9e-adad-e6ec84732eb9\") " Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.850128 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa60a9bd-1933-44b8-ac17-be1f72c9da68" (UID: "fa60a9bd-1933-44b8-ac17-be1f72c9da68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.850352 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a1a6049-a924-4e9e-adad-e6ec84732eb9" (UID: "6a1a6049-a924-4e9e-adad-e6ec84732eb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.851077 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.850967 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts" (OuterVolumeSpecName: "scripts") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.850556 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run" (OuterVolumeSpecName: "var-run") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.851194 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.851573 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.853307 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c" (OuterVolumeSpecName: "kube-api-access-9pb5c") pod "6a1a6049-a924-4e9e-adad-e6ec84732eb9" (UID: "6a1a6049-a924-4e9e-adad-e6ec84732eb9"). 
InnerVolumeSpecName "kube-api-access-9pb5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.853622 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx" (OuterVolumeSpecName: "kube-api-access-drskx") pod "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" (UID: "1ce73342-b88b-4d1a-b6dc-8e7979f05e6d"). InnerVolumeSpecName "kube-api-access-drskx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.854020 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff" (OuterVolumeSpecName: "kube-api-access-rddff") pod "fa60a9bd-1933-44b8-ac17-be1f72c9da68" (UID: "fa60a9bd-1933-44b8-ac17-be1f72c9da68"). InnerVolumeSpecName "kube-api-access-rddff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950752 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drskx\" (UniqueName: \"kubernetes.io/projected/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-kube-api-access-drskx\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950804 4926 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950816 4926 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950827 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a1a6049-a924-4e9e-adad-e6ec84732eb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950836 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950845 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pb5c\" (UniqueName: \"kubernetes.io/projected/6a1a6049-a924-4e9e-adad-e6ec84732eb9-kube-api-access-9pb5c\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950853 4926 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950861 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rddff\" (UniqueName: \"kubernetes.io/projected/fa60a9bd-1933-44b8-ac17-be1f72c9da68-kube-api-access-rddff\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950869 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa60a9bd-1933-44b8-ac17-be1f72c9da68-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:33 crc kubenswrapper[4926]: I0312 18:22:33.950878 4926 
reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d-var-run\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.208365 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-40f3-account-create-update-6f5kz" event={"ID":"6a1a6049-a924-4e9e-adad-e6ec84732eb9","Type":"ContainerDied","Data":"7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013"} Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.208675 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f60838faf0603a5159f5f3ca03ddcba8992db0a0a7cb8b0a847b3f2ddc0a013" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.208389 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-40f3-account-create-update-6f5kz" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.211859 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-n9jj2" event={"ID":"fa60a9bd-1933-44b8-ac17-be1f72c9da68","Type":"ContainerDied","Data":"2b292237feadafee95b31e06cfc840f72c515e88b8454c8337e1e74e28cd7576"} Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.211915 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b292237feadafee95b31e06cfc840f72c515e88b8454c8337e1e74e28cd7576" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.211981 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-n9jj2" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.224181 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sfwpr-config-55drx" event={"ID":"1ce73342-b88b-4d1a-b6dc-8e7979f05e6d","Type":"ContainerDied","Data":"1b64556f5bf261422bd7a946c6906ac510a7650f9110c56570560ce6cb0ceff7"} Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.224229 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b64556f5bf261422bd7a946c6906ac510a7650f9110c56570560ce6cb0ceff7" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.224291 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sfwpr-config-55drx" Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.238275 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"f7925e6f8dd4e91bc33ea9de7de46827a465fec80fdc2c72f0e0c6ca7014c6b6"} Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.238691 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"98967e9369f3695e123659b5825995f560be830c73493e6998ebd0fb99612e49"} Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.811913 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sfwpr-config-55drx"] Mar 12 18:22:34 crc kubenswrapper[4926]: I0312 18:22:34.823984 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sfwpr-config-55drx"] Mar 12 18:22:36 crc kubenswrapper[4926]: I0312 18:22:36.503674 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" path="/var/lib/kubelet/pods/1ce73342-b88b-4d1a-b6dc-8e7979f05e6d/volumes" Mar 12 18:22:36 crc kubenswrapper[4926]: I0312 18:22:36.869217 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:36 crc kubenswrapper[4926]: I0312 18:22:36.913285 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:36 crc kubenswrapper[4926]: I0312 18:22:36.954303 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:36 crc kubenswrapper[4926]: I0312 18:22:36.983531 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.000720 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nh4b\" (UniqueName: \"kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b\") pod \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.001129 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r6dd\" (UniqueName: \"kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd\") pod \"f7917312-5922-44df-a838-6b5452e8bb84\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.002388 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts\") pod \"f7917312-5922-44df-a838-6b5452e8bb84\" (UID: \"f7917312-5922-44df-a838-6b5452e8bb84\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.002455 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts\") pod \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\" (UID: \"9e5bf33c-5cb6-4b29-95f4-fcf47183be58\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.003737 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e5bf33c-5cb6-4b29-95f4-fcf47183be58" (UID: "9e5bf33c-5cb6-4b29-95f4-fcf47183be58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.004173 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7917312-5922-44df-a838-6b5452e8bb84" (UID: "f7917312-5922-44df-a838-6b5452e8bb84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.006747 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd" (OuterVolumeSpecName: "kube-api-access-7r6dd") pod "f7917312-5922-44df-a838-6b5452e8bb84" (UID: "f7917312-5922-44df-a838-6b5452e8bb84"). InnerVolumeSpecName "kube-api-access-7r6dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.006803 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b" (OuterVolumeSpecName: "kube-api-access-8nh4b") pod "9e5bf33c-5cb6-4b29-95f4-fcf47183be58" (UID: "9e5bf33c-5cb6-4b29-95f4-fcf47183be58"). InnerVolumeSpecName "kube-api-access-8nh4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.104030 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7tgz\" (UniqueName: \"kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz\") pod \"05959443-d099-4653-9736-7745ba1ce331\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.104131 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts\") pod \"05959443-d099-4653-9736-7745ba1ce331\" (UID: \"05959443-d099-4653-9736-7745ba1ce331\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.104294 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jl7q\" (UniqueName: \"kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q\") pod \"ea314c69-7524-43d6-9d4f-9fdb16510952\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.104921 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts\") pod \"ea314c69-7524-43d6-9d4f-9fdb16510952\" (UID: \"ea314c69-7524-43d6-9d4f-9fdb16510952\") " Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.104841 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05959443-d099-4653-9736-7745ba1ce331" (UID: "05959443-d099-4653-9736-7745ba1ce331"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.105795 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05959443-d099-4653-9736-7745ba1ce331-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.105955 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r6dd\" (UniqueName: \"kubernetes.io/projected/f7917312-5922-44df-a838-6b5452e8bb84-kube-api-access-7r6dd\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.106032 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7917312-5922-44df-a838-6b5452e8bb84-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.106100 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.106176 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nh4b\" (UniqueName: \"kubernetes.io/projected/9e5bf33c-5cb6-4b29-95f4-fcf47183be58-kube-api-access-8nh4b\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.105798 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea314c69-7524-43d6-9d4f-9fdb16510952" (UID: "ea314c69-7524-43d6-9d4f-9fdb16510952"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.107419 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q" (OuterVolumeSpecName: "kube-api-access-7jl7q") pod "ea314c69-7524-43d6-9d4f-9fdb16510952" (UID: "ea314c69-7524-43d6-9d4f-9fdb16510952"). InnerVolumeSpecName "kube-api-access-7jl7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.108869 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz" (OuterVolumeSpecName: "kube-api-access-w7tgz") pod "05959443-d099-4653-9736-7745ba1ce331" (UID: "05959443-d099-4653-9736-7745ba1ce331"). InnerVolumeSpecName "kube-api-access-w7tgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.207280 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jl7q\" (UniqueName: \"kubernetes.io/projected/ea314c69-7524-43d6-9d4f-9fdb16510952-kube-api-access-7jl7q\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.207555 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea314c69-7524-43d6-9d4f-9fdb16510952-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.207620 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7tgz\" (UniqueName: \"kubernetes.io/projected/05959443-d099-4653-9736-7745ba1ce331-kube-api-access-w7tgz\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.266500 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"b030a2d014f400d50cb89a2eda000385e992f52900539bdd3c6a7b659a8269b2"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.266557 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"0ad8612950d373ec053303346d238f478f191d4b05be61aed64cadc630109fad"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.268975 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-n28hs" event={"ID":"05959443-d099-4653-9736-7745ba1ce331","Type":"ContainerDied","Data":"def54e7f3bca68561150fcadb909db84a99aaf5cfe2458f9499bbf1014d7ccf8"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.269008 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def54e7f3bca68561150fcadb909db84a99aaf5cfe2458f9499bbf1014d7ccf8" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.269012 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-n28hs" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.270927 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vtkq" event={"ID":"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d","Type":"ContainerStarted","Data":"7fd785662fcf1b6a3363e17fd1fb61ae49e121593b0c0273db1785eefe4c3db8"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.274749 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2c76-account-create-update-wvgzd" event={"ID":"f7917312-5922-44df-a838-6b5452e8bb84","Type":"ContainerDied","Data":"63fccee8b612554f7b094c76fecdac4ac35f19b95dc6b9a6784a3936c615d9a5"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.274910 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63fccee8b612554f7b094c76fecdac4ac35f19b95dc6b9a6784a3936c615d9a5" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.275006 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2c76-account-create-update-wvgzd" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.287849 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-bf50-account-create-update-ckzh6" event={"ID":"9e5bf33c-5cb6-4b29-95f4-fcf47183be58","Type":"ContainerDied","Data":"c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.287887 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56db79b42dcfa1fc516e7e174210c09707255662ee7602092c441dc8b95945d" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.287940 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-bf50-account-create-update-ckzh6" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.294281 4926 generic.go:334] "Generic (PLEG): container finished" podID="6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" containerID="f8a8e8e5fcda402042df2295b1c2cbd321aa245d9b7be946af5559093b56242a" exitCode=0 Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.294402 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-72msf" event={"ID":"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8","Type":"ContainerDied","Data":"f8a8e8e5fcda402042df2295b1c2cbd321aa245d9b7be946af5559093b56242a"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.296802 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6zk" event={"ID":"ea314c69-7524-43d6-9d4f-9fdb16510952","Type":"ContainerDied","Data":"c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5"} Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.296834 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f3bf41c45721fdd76098163bfbbd5fb8a4d739371fec426b54c95a8d7546f5" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.300088 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6zk" Mar 12 18:22:37 crc kubenswrapper[4926]: I0312 18:22:37.310572 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-4vtkq" podStartSLOduration=3.358687545 podStartE2EDuration="8.310552681s" podCreationTimestamp="2026-03-12 18:22:29 +0000 UTC" firstStartedPulling="2026-03-12 18:22:31.739421746 +0000 UTC m=+1192.108048089" lastFinishedPulling="2026-03-12 18:22:36.691286892 +0000 UTC m=+1197.059913225" observedRunningTime="2026-03-12 18:22:37.304243645 +0000 UTC m=+1197.672869978" watchObservedRunningTime="2026-03-12 18:22:37.310552681 +0000 UTC m=+1197.679179044" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.653968 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-72msf" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.731862 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle\") pod \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.731911 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data\") pod \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.731981 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data\") pod \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.732080 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9kjm\" (UniqueName: \"kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm\") pod \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\" (UID: \"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8\") " Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.735743 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm" (OuterVolumeSpecName: "kube-api-access-q9kjm") pod "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" (UID: "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8"). InnerVolumeSpecName "kube-api-access-q9kjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.735984 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" (UID: "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.755675 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" (UID: "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.799273 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data" (OuterVolumeSpecName: "config-data") pod "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" (UID: "6bad0b5a-a817-45ec-9ebb-7b30d7492ed8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.834142 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.834179 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9kjm\" (UniqueName: \"kubernetes.io/projected/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-kube-api-access-q9kjm\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.834191 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:38 crc kubenswrapper[4926]: I0312 18:22:38.834202 4926 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.331812 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-72msf" event={"ID":"6bad0b5a-a817-45ec-9ebb-7b30d7492ed8","Type":"ContainerDied","Data":"0b8074caa3b0c10eb9a072ab0513f4ac17eb3bec44cc3edb55302dc941d344c0"} Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.332093 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b8074caa3b0c10eb9a072ab0513f4ac17eb3bec44cc3edb55302dc941d344c0" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.332162 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-72msf" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.345546 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"dcd686948bb98223945c78239c5e2eade4b334fdb486f13f8183a786b11bcb77"} Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.345806 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"2c4216509b2e96a37dad4fb826f0c82a6f461387fbac81c65033023a3b846844"} Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.345877 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"2814dd88eb80db2962578d8009f8f05d069c5caf4e38d32a30827d4b23f367dc"} Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.798662 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799231 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1a6049-a924-4e9e-adad-e6ec84732eb9" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799243 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1a6049-a924-4e9e-adad-e6ec84732eb9" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799260 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5bf33c-5cb6-4b29-95f4-fcf47183be58" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799266 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5bf33c-5cb6-4b29-95f4-fcf47183be58" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799277 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea314c69-7524-43d6-9d4f-9fdb16510952" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799284 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea314c69-7524-43d6-9d4f-9fdb16510952" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799294 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349ca4f5-349b-45ab-98a4-844fa00599d0" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799300 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="349ca4f5-349b-45ab-98a4-844fa00599d0" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799308 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7917312-5922-44df-a838-6b5452e8bb84" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799315 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7917312-5922-44df-a838-6b5452e8bb84" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799326 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05959443-d099-4653-9736-7745ba1ce331" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799331 4926 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="05959443-d099-4653-9736-7745ba1ce331" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799342 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" containerName="ovn-config" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799347 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" containerName="ovn-config" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799362 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa60a9bd-1933-44b8-ac17-be1f72c9da68" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799368 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa60a9bd-1933-44b8-ac17-be1f72c9da68" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: E0312 18:22:39.799377 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" containerName="glance-db-sync" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.799383 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" containerName="glance-db-sync" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807497 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="349ca4f5-349b-45ab-98a4-844fa00599d0" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807542 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5bf33c-5cb6-4b29-95f4-fcf47183be58" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807558 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea314c69-7524-43d6-9d4f-9fdb16510952" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807573 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7917312-5922-44df-a838-6b5452e8bb84" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807587 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" containerName="glance-db-sync" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807596 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1a6049-a924-4e9e-adad-e6ec84732eb9" containerName="mariadb-account-create-update" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807616 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="05959443-d099-4653-9736-7745ba1ce331" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807632 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa60a9bd-1933-44b8-ac17-be1f72c9da68" containerName="mariadb-database-create" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.807641 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce73342-b88b-4d1a-b6dc-8e7979f05e6d" containerName="ovn-config" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.808926 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.819009 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.952942 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.953047 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.953145 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.953241 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xrz5\" (UniqueName: \"kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:39 crc kubenswrapper[4926]: I0312 18:22:39.953318 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.058240 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.058304 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.058326 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.058395 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.058469 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xrz5\" (UniqueName: \"kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.059909 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.060412 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.060918 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.061465 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.076372 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xrz5\" (UniqueName: \"kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5\") pod \"dnsmasq-dns-5b946c75cc-s8pxk\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.127848 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.364120 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"3505cb804eb89707510f827ea55bf8c0b5498a603a4dc33a295007ca469842dc"} Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.364376 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"1661ebf70a682193744c016efe51c609283a83267b967202bbe346b08fdc327c"} Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.364386 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"35156daa5963895ee8abf10c0b8630bfb036d1245a6d593e55409209aabbf565"} Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.364396 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"57853681-32de-4475-9c7d-3f9708fe7d91","Type":"ContainerStarted","Data":"46bc1bb64318be0810542d3dd7cfbc08c1c22a5f45619d5edb2d27d87b4afabe"} Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.584876 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.789438171 podStartE2EDuration="48.584857654s" podCreationTimestamp="2026-03-12 18:21:52 +0000 UTC" firstStartedPulling="2026-03-12 18:22:28.50747937 +0000 UTC m=+1188.876105703" lastFinishedPulling="2026-03-12 18:22:38.302898843 +0000 UTC m=+1198.671525186" observedRunningTime="2026-03-12 18:22:40.407765026 +0000 UTC m=+1200.776391359" watchObservedRunningTime="2026-03-12 18:22:40.584857654 +0000 UTC m=+1200.953483987" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.587095 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.673373 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.713336 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"] Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.714994 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.717203 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.733967 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"] Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.766955 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.766998 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dg8\" (UniqueName: \"kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.767066 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.767113 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.767155 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.767180 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869181 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869262 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: 
\"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869315 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869451 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869490 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dg8\" (UniqueName: \"kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.869530 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.870886 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.870897 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.870951 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.870890 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: I0312 18:22:40.871028 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:40 crc kubenswrapper[4926]: 
I0312 18:22:40.889038 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dg8\" (UniqueName: \"kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8\") pod \"dnsmasq-dns-74f6bcbc87-tvrln\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.072139 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.372712 4926 generic.go:334] "Generic (PLEG): container finished" podID="4f5dc206-65d8-4b00-a55d-b505d640a279" containerID="be98fd8613351be05b0464caa0d08812f4423f1da0f0227e98091b428e4425a5" exitCode=0 Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.372793 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" event={"ID":"4f5dc206-65d8-4b00-a55d-b505d640a279","Type":"ContainerDied","Data":"be98fd8613351be05b0464caa0d08812f4423f1da0f0227e98091b428e4425a5"} Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.373025 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" event={"ID":"4f5dc206-65d8-4b00-a55d-b505d640a279","Type":"ContainerStarted","Data":"62587664fe08f5a1f36b2ec950757fbec287bd8897e920da359c13c46fb57219"} Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.374597 4926 generic.go:334] "Generic (PLEG): container finished" podID="d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" containerID="7fd785662fcf1b6a3363e17fd1fb61ae49e121593b0c0273db1785eefe4c3db8" exitCode=0 Mar 12 18:22:41 crc kubenswrapper[4926]: I0312 18:22:41.374666 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vtkq" event={"ID":"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d","Type":"ContainerDied","Data":"7fd785662fcf1b6a3363e17fd1fb61ae49e121593b0c0273db1785eefe4c3db8"} Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.051702 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.119393 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"] Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.124354 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xrz5\" (UniqueName: \"kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5\") pod \"4f5dc206-65d8-4b00-a55d-b505d640a279\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.124478 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb\") pod \"4f5dc206-65d8-4b00-a55d-b505d640a279\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.124523 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb\") pod \"4f5dc206-65d8-4b00-a55d-b505d640a279\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.124542 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc\") pod \"4f5dc206-65d8-4b00-a55d-b505d640a279\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.124575 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config\") pod \"4f5dc206-65d8-4b00-a55d-b505d640a279\" (UID: \"4f5dc206-65d8-4b00-a55d-b505d640a279\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.132043 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5" (OuterVolumeSpecName: "kube-api-access-9xrz5") pod "4f5dc206-65d8-4b00-a55d-b505d640a279" (UID: "4f5dc206-65d8-4b00-a55d-b505d640a279"). InnerVolumeSpecName "kube-api-access-9xrz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.146607 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f5dc206-65d8-4b00-a55d-b505d640a279" (UID: "4f5dc206-65d8-4b00-a55d-b505d640a279"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.148243 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config" (OuterVolumeSpecName: "config") pod "4f5dc206-65d8-4b00-a55d-b505d640a279" (UID: "4f5dc206-65d8-4b00-a55d-b505d640a279"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.150570 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f5dc206-65d8-4b00-a55d-b505d640a279" (UID: "4f5dc206-65d8-4b00-a55d-b505d640a279"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.155265 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f5dc206-65d8-4b00-a55d-b505d640a279" (UID: "4f5dc206-65d8-4b00-a55d-b505d640a279"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.232486 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.232525 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.232540 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.232553 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5dc206-65d8-4b00-a55d-b505d640a279-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.232567 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xrz5\" (UniqueName: \"kubernetes.io/projected/4f5dc206-65d8-4b00-a55d-b505d640a279-kube-api-access-9xrz5\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.387666 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.387673 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-s8pxk" event={"ID":"4f5dc206-65d8-4b00-a55d-b505d640a279","Type":"ContainerDied","Data":"62587664fe08f5a1f36b2ec950757fbec287bd8897e920da359c13c46fb57219"} Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.387826 4926 scope.go:117] "RemoveContainer" containerID="be98fd8613351be05b0464caa0d08812f4423f1da0f0227e98091b428e4425a5" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.391585 4926 generic.go:334] "Generic (PLEG): container finished" podID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerID="fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b" exitCode=0 Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.392619 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" event={"ID":"69fa4317-bafb-462d-b6aa-ea07437277f1","Type":"ContainerDied","Data":"fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b"} Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.392656 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" event={"ID":"69fa4317-bafb-462d-b6aa-ea07437277f1","Type":"ContainerStarted","Data":"a0cec62e2c7e846902634b014c1304ee38e6c7bc3f234649898d90b9aefa0ac7"} Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.492373 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.504130 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-s8pxk"] Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.725289 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.843993 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data\") pod \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.844165 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8k85\" (UniqueName: \"kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85\") pod \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.844290 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle\") pod \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\" (UID: \"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d\") " Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.847542 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85" (OuterVolumeSpecName: "kube-api-access-w8k85") pod "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" (UID: "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d"). InnerVolumeSpecName "kube-api-access-w8k85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.868194 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" (UID: "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.885765 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data" (OuterVolumeSpecName: "config-data") pod "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" (UID: "d7b6b9d3-0c3c-4dcc-b417-49d53269c39d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.946599 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.946639 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8k85\" (UniqueName: \"kubernetes.io/projected/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-kube-api-access-w8k85\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:42 crc kubenswrapper[4926]: I0312 18:22:42.946657 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.402036 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" event={"ID":"69fa4317-bafb-462d-b6aa-ea07437277f1","Type":"ContainerStarted","Data":"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee"} Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.403267 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.412157 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-4vtkq" event={"ID":"d7b6b9d3-0c3c-4dcc-b417-49d53269c39d","Type":"ContainerDied","Data":"7c7e1f1cb85b710df8d9fc682c9ab5a089b0604e2b56bc8fd5036a48bcce3088"} Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.412197 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c7e1f1cb85b710df8d9fc682c9ab5a089b0604e2b56bc8fd5036a48bcce3088" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.412223 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-4vtkq" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.435801 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" podStartSLOduration=3.4357776700000002 podStartE2EDuration="3.43577767s" podCreationTimestamp="2026-03-12 18:22:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:43.430453844 +0000 UTC m=+1203.799080177" watchObservedRunningTime="2026-03-12 18:22:43.43577767 +0000 UTC m=+1203.804404003" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.682565 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"] Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.717873 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hrv24"] Mar 12 18:22:43 crc kubenswrapper[4926]: E0312 18:22:43.718486 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" containerName="keystone-db-sync" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.718569 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" containerName="keystone-db-sync" Mar 12 18:22:43 crc kubenswrapper[4926]: E0312 18:22:43.718642 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5dc206-65d8-4b00-a55d-b505d640a279" containerName="init" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.718716 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5dc206-65d8-4b00-a55d-b505d640a279" containerName="init" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.718968 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" containerName="keystone-db-sync" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.719050 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5dc206-65d8-4b00-a55d-b505d640a279" containerName="init" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.725772 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.732614 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.732951 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.733173 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vrsc7" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.733246 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.733466 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.807224 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.808474 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.828533 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hrv24"] Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.841613 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877315 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877399 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877429 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877461 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877486 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpnk7\" (UniqueName: \"kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.877628 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980604 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980655 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc 
kubenswrapper[4926]: I0312 18:22:43.980678 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpnk7\" (UniqueName: \"kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980720 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980792 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980816 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980868 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980908 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980936 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980970 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:43 crc kubenswrapper[4926]: I0312 18:22:43.980992 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:43 crc 
kubenswrapper[4926]: I0312 18:22:43.981016 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sftv\" (UniqueName: \"kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.007132 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.007845 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.008329 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.021993 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.023233 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.031164 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.031387 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.032620 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.032865 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.033680 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.048706 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4vqq4" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.053574 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093245 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093332 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093390 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093411 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093447 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.093465 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sftv\" (UniqueName: \"kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.094584 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.095100 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.095597 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.096110 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc\") pod 
\"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.103157 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.103934 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.118973 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpnk7\" (UniqueName: \"kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7\") pod \"keystone-bootstrap-hrv24\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.133885 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.153596 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.153781 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.172210 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.192146 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sftv\" (UniqueName: \"kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv\") pod \"dnsmasq-dns-847c4cc679-6j58l\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.195369 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.195422 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.195463 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.195512 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data\") pod 
\"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.195601 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flm2j\" (UniqueName: \"kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.235620 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vc5cr"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.236988 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.245005 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.253999 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.254251 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qpbhw" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.254751 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-sgnbr"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.255761 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.271106 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hz28b" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.271359 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.296486 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vc5cr"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297351 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297390 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297407 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297449 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key\") pod 
\"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297471 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297496 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrs2\" (UniqueName: \"kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297526 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297542 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297562 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297583 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flm2j\" (UniqueName: \"kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297614 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297662 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.297922 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.298514 
4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.299473 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.327526 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-txt96"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.328616 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.340399 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.340616 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.349170 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.352299 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flm2j\" (UniqueName: \"kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j\") pod \"horizon-7786dc65fc-n6hp6\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.352414 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sgnbr"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.360726 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.367955 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9lh8l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.384708 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-txt96"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402697 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402765 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df6kk\" (UniqueName: \"kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402787 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402805 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd7q\" (UniqueName: \"kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402826 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402846 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402868 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402883 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " 
pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402902 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402919 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402951 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mklss\" (UniqueName: \"kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.402980 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrs2\" (UniqueName: \"kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403016 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403040 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403062 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403091 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403112 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403130 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.403552 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.413230 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.418017 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lgvzs"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.419082 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.426970 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.427245 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.427511 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m6xv4" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.429353 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.432024 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.433280 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.441920 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.449803 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.473400 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.481411 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84786596b5-28lbk"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.482732 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508830 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqfm\" (UniqueName: \"kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508869 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mklss\" (UniqueName: \"kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508894 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508917 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508956 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.508976 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509000 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509037 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509055 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509078 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df6kk\" (UniqueName: \"kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509093 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509111 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509128 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd7q\" (UniqueName: \"kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509148 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509169 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509185 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509202 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.509206 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrs2\" (UniqueName: 
\"kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2\") pod \"ceilometer-0\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.512239 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.520081 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.525341 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5dc206-65d8-4b00-a55d-b505d640a279" path="/var/lib/kubelet/pods/4f5dc206-65d8-4b00-a55d-b505d640a279/volumes" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.533672 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.534865 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.537362 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd7q\" (UniqueName: \"kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.538025 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data\") pod \"placement-db-sync-txt96\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.538674 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.541079 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.546050 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.551039 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.556800 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mklss\" (UniqueName: \"kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss\") pod \"barbican-db-sync-sgnbr\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.572950 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df6kk\" (UniqueName: \"kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk\") pod \"neutron-db-sync-vc5cr\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.601156 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgvzs"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.610629 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84786596b5-28lbk"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611427 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7bxq\" (UniqueName: \"kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611486 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611521 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgqfm\" (UniqueName: \"kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611544 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611615 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611641 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611667 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611733 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611823 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611889 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.611940 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.613865 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.615486 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.619080 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.622392 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " 
pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.622947 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.629955 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.632341 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.672400 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqfm\" (UniqueName: \"kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm\") pod \"cinder-db-sync-lgvzs\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.704968 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.717048 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-txt96" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.717248 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718748 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718849 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7bxq\" (UniqueName: \"kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718882 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718921 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") 
" pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718945 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.718989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.719013 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.719028 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.719051 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlwc\" (UniqueName: \"kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.719071 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.719806 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.720639 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.720884 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.724277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.728201 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.731937 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.735318 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.737810 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2zfbp" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.738084 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.759044 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.760229 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.762375 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7bxq\" (UniqueName: \"kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq\") pod \"horizon-84786596b5-28lbk\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.773774 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.776356 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.805021 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.816779 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.818461 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821108 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821176 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821229 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821247 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821476 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.821586 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlwc\" (UniqueName: \"kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.822616 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.823839 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.825338 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 
crc kubenswrapper[4926]: I0312 18:22:44.827004 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.827180 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.828186 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.830210 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.837853 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.841136 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.848049 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlwc\" (UniqueName: \"kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc\") pod \"dnsmasq-dns-785d8bcb8c-7pqqh\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") " pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924559 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924608 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924649 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924669 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924711 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924730 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn68d\" (UniqueName: \"kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924764 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924783 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924814 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924845 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924886 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924915 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924947 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2njd\" (UniqueName: \"kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 
18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.924963 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.925636 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:44 crc kubenswrapper[4926]: I0312 18:22:44.925712 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.029775 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030252 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030280 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030294 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn68d\" (UniqueName: \"kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030315 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030333 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: 
I0312 18:22:45.030381 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030428 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030505 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030537 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030579 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2njd\" (UniqueName: \"kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030602 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030637 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030664 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030690 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030714 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.030932 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.031184 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.031521 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.031788 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.032039 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.046237 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.047063 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.047746 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.048031 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.048094 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.052413 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.053430 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn68d\" (UniqueName: \"kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.055688 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.057418 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.064798 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.069571 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.071298 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2njd\" (UniqueName: \"kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.081077 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.083430 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.086052 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.154242 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.227470 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hrv24"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.259226 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.376133 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.458826 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrv24" event={"ID":"fdce3e2c-d108-4904-b4ab-8d95d4838bad","Type":"ContainerStarted","Data":"334e13044642c4b0a613f2bfccdc8e4e0067f4f734093e62a567052168539e3c"} Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.459150 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrv24" event={"ID":"fdce3e2c-d108-4904-b4ab-8d95d4838bad","Type":"ContainerStarted","Data":"cbd223739736371712c30899d1265bcdf31be415ef718a7391e85bb96801c5f5"} Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.460995 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7786dc65fc-n6hp6" event={"ID":"8c2a8323-527c-4e19-ab55-3c291c4d538f","Type":"ContainerStarted","Data":"1a2407e300165d17a9ee94f4b0ab357bc8d93ae3d9e96138519ce22a7544a84a"} Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.462967 4926 generic.go:334] "Generic (PLEG): container finished" podID="c72142af-cd95-4044-92d7-ce173109afa7" containerID="05f95959f0e95ab682be03e1080674ba4ddff508d0c0201c65a7316a98a66c30" exitCode=0 Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.463172 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="dnsmasq-dns" containerID="cri-o://9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee" gracePeriod=10 Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.463611 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" event={"ID":"c72142af-cd95-4044-92d7-ce173109afa7","Type":"ContainerDied","Data":"05f95959f0e95ab682be03e1080674ba4ddff508d0c0201c65a7316a98a66c30"} Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.463651 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" event={"ID":"c72142af-cd95-4044-92d7-ce173109afa7","Type":"ContainerStarted","Data":"cac6ba4747f07ad7b159a2c64c5d5c743305d616dd8e88a4a78b6bbef0d66fcd"} Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.482250 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hrv24" podStartSLOduration=2.482226876 podStartE2EDuration="2.482226876s" podCreationTimestamp="2026-03-12 18:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:45.479236353 +0000 UTC m=+1205.847862686" watchObservedRunningTime="2026-03-12 18:22:45.482226876 +0000 UTC m=+1205.850853219" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.715651 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vc5cr"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.743073 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-sgnbr"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.752566 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84786596b5-28lbk"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.764246 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lgvzs"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.774240 4926 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/placement-db-sync-txt96"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.812776 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.821718 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"] Mar 12 18:22:45 crc kubenswrapper[4926]: W0312 18:22:45.823974 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf5b15ef_fa73_4ef4_9235_41da81503d2c.slice/crio-c327e0d0fbf885a0bb6f202d846326d2682c20d15307d1ed071e6aa3068c47c0 WatchSource:0}: Error finding container c327e0d0fbf885a0bb6f202d846326d2682c20d15307d1ed071e6aa3068c47c0: Status 404 returned error can't find the container with id c327e0d0fbf885a0bb6f202d846326d2682c20d15307d1ed071e6aa3068c47c0 Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.910297 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.966580 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.966738 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.966830 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sftv\" (UniqueName: \"kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.966982 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.968799 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.968854 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb\") pod \"c72142af-cd95-4044-92d7-ce173109afa7\" (UID: \"c72142af-cd95-4044-92d7-ce173109afa7\") " Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.993811 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv" (OuterVolumeSpecName: "kube-api-access-4sftv") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: 
"c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "kube-api-access-4sftv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:45 crc kubenswrapper[4926]: I0312 18:22:45.996299 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: "c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.011191 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: "c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.027244 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config" (OuterVolumeSpecName: "config") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: "c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.040759 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: "c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.060521 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c72142af-cd95-4044-92d7-ce173109afa7" (UID: "c72142af-cd95-4044-92d7-ce173109afa7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077730 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077770 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077784 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sftv\" (UniqueName: \"kubernetes.io/projected/c72142af-cd95-4044-92d7-ce173109afa7-kube-api-access-4sftv\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077794 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077802 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.077810 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c72142af-cd95-4044-92d7-ce173109afa7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.167468 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.257081 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.283339 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"] Mar 12 18:22:46 crc kubenswrapper[4926]: E0312 18:22:46.283679 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72142af-cd95-4044-92d7-ce173109afa7" containerName="init" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.283696 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72142af-cd95-4044-92d7-ce173109afa7" containerName="init" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.283847 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72142af-cd95-4044-92d7-ce173109afa7" containerName="init" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.284698 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.294752 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.306761 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.326805 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.365519 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.387108 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389089 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389135 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389172 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389248 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389287 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dg8\" (UniqueName: \"kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8\") pod \"69fa4317-bafb-462d-b6aa-ea07437277f1\" (UID: \"69fa4317-bafb-462d-b6aa-ea07437277f1\") " Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389710 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389776 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.389856 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.390008 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.390036 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g599\" (UniqueName: \"kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.399634 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8" (OuterVolumeSpecName: "kube-api-access-p4dg8") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "kube-api-access-p4dg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.437309 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.449472 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.493212 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.493656 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g599\" (UniqueName: \"kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.493800 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.493859 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.493940 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.494072 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dg8\" (UniqueName: \"kubernetes.io/projected/69fa4317-bafb-462d-b6aa-ea07437277f1-kube-api-access-p4dg8\") on node \"crc\" DevicePath \"\"" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.494676 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.497198 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.497981 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.513370 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.519375 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g599\" (UniqueName: \"kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599\") pod \"horizon-749fd9494c-gtzjf\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536092 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgvzs" event={"ID":"dac4b5d6-fb31-4955-8679-db9d3ff63c10","Type":"ContainerStarted","Data":"1b929dd828244692447588d25c4d9ac5aab4e983cee4fcf89d743277f69d1b5d"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84786596b5-28lbk" event={"ID":"1ca9ea6f-908c-48db-9313-c3ff4809a993","Type":"ContainerStarted","Data":"87827b8a20d43f6415fa4987b8da56ced353164794c7d29438fb6ffa36394a39"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536156 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerStarted","Data":"7a86c1cac56bc2792ca8a01485262ffdadcde243619bf97536386064fe81ed59"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536170 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" 
event={"ID":"af5b15ef-fa73-4ef4-9235-41da81503d2c","Type":"ContainerStarted","Data":"c327e0d0fbf885a0bb6f202d846326d2682c20d15307d1ed071e6aa3068c47c0"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536182 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sgnbr" event={"ID":"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6","Type":"ContainerStarted","Data":"89660f246e3c30bdbd43247de70865e18df164fd93b36ad1830555d68b2f5032"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536194 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vc5cr" event={"ID":"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b","Type":"ContainerStarted","Data":"d40939b357197123b7378525a59c3f91e7240cb30140873883a85a0672ad25de"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.536206 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vc5cr" event={"ID":"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b","Type":"ContainerStarted","Data":"85be7b1fc6b389d0d4f2769e75e7678715b75c3e095cab19a3e0aad9bb5656b7"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.539547 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerStarted","Data":"fde586bc97eba127841e218475be2e383628bdf6feeb7595fc6e0b0daaf96628"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.546874 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-txt96" event={"ID":"7a0f2830-bf50-4195-9dad-d4d2c9529ee9","Type":"ContainerStarted","Data":"ce15d4b03307ac2a9a7f3c1d05eba2559878bf699c0483537cc9d838c8197125"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.552807 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" event={"ID":"c72142af-cd95-4044-92d7-ce173109afa7","Type":"ContainerDied","Data":"cac6ba4747f07ad7b159a2c64c5d5c743305d616dd8e88a4a78b6bbef0d66fcd"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.552863 4926 scope.go:117] "RemoveContainer" containerID="05f95959f0e95ab682be03e1080674ba4ddff508d0c0201c65a7316a98a66c30" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.552983 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-6j58l" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.562448 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerStarted","Data":"d093554ce00472559ced5d4b721fe022998a063b108c7d7113b402e4a98bca54"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.568257 4926 generic.go:334] "Generic (PLEG): container finished" podID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerID="9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee" exitCode=0 Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.568515 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" event={"ID":"69fa4317-bafb-462d-b6aa-ea07437277f1","Type":"ContainerDied","Data":"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.568567 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" event={"ID":"69fa4317-bafb-462d-b6aa-ea07437277f1","Type":"ContainerDied","Data":"a0cec62e2c7e846902634b014c1304ee38e6c7bc3f234649898d90b9aefa0ac7"} Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.568530 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-tvrln" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.568806 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vc5cr" podStartSLOduration=2.5687593680000003 podStartE2EDuration="2.568759368s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:46.562903705 +0000 UTC m=+1206.931530038" watchObservedRunningTime="2026-03-12 18:22:46.568759368 +0000 UTC m=+1206.937385691" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.613686 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.650377 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.668232 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-6j58l"] Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.677087 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.691986 4926 scope.go:117] "RemoveContainer" containerID="9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee" Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.692538 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.704868 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.704890 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.706262 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config" (OuterVolumeSpecName: "config") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.718891 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.719231 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69fa4317-bafb-462d-b6aa-ea07437277f1" (UID: "69fa4317-bafb-462d-b6aa-ea07437277f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.737224 4926 scope.go:117] "RemoveContainer" containerID="fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b"
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.807468 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.807498 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.807510 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69fa4317-bafb-462d-b6aa-ea07437277f1-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.818744 4926 scope.go:117] "RemoveContainer" containerID="9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee"
Mar 12 18:22:46 crc kubenswrapper[4926]: E0312 18:22:46.826028 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee\": container with ID starting with 9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee not found: ID does not exist" containerID="9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee"
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.826080 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee"} err="failed to get container status \"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee\": rpc error: code = NotFound desc = could not find container \"9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee\": container with ID starting with 9335e26e5334fd57f90ea438fbbb564ec2611ad75fe24dbee8759015a015abee not found: ID does not exist"
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.826114 4926 scope.go:117] "RemoveContainer" containerID="fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b"
Mar 12 18:22:46 crc kubenswrapper[4926]: E0312 18:22:46.826635 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b\": container with ID starting with fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b not found: ID does not exist" containerID="fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b"
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.826767 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b"} err="failed to get container status \"fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b\": rpc error: code = NotFound desc = could not find container \"fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b\": container with ID starting with fa78af8d923d704df31a407f7e84a2ed31c097099bad226b3668d09e28bca79b not found: ID does not exist"
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.934032 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"]
Mar 12 18:22:46 crc kubenswrapper[4926]: I0312 18:22:46.953614 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-tvrln"]
Mar 12 18:22:47 crc kubenswrapper[4926]: I0312 18:22:47.280897 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"]
Mar 12 18:22:47 crc kubenswrapper[4926]: W0312 18:22:47.283201 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04df6e00_4552_471c_9ae0_f45362e7e2b4.slice/crio-2d38489365b71e942082167910ea77ad7b51873f3aed248a2bfa421546b7758e WatchSource:0}: Error finding container 2d38489365b71e942082167910ea77ad7b51873f3aed248a2bfa421546b7758e: Status 404 returned error can't find the container with id 2d38489365b71e942082167910ea77ad7b51873f3aed248a2bfa421546b7758e
Mar 12 18:22:47 crc kubenswrapper[4926]: I0312 18:22:47.598988 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerStarted","Data":"e192a80af72131c894390e82b963179cefd9639c84c7a61aeb7db9b8d5176057"}
Mar 12 18:22:47 crc kubenswrapper[4926]: I0312 18:22:47.610879 4926 generic.go:334] "Generic (PLEG): container finished" podID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerID="d817244fbaa6cac20d7a51e9c98e06316cbeae770e3def99741dd49bb46b1970" exitCode=0
Mar 12 18:22:47 crc kubenswrapper[4926]: I0312 18:22:47.611078 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" event={"ID":"af5b15ef-fa73-4ef4-9235-41da81503d2c","Type":"ContainerDied","Data":"d817244fbaa6cac20d7a51e9c98e06316cbeae770e3def99741dd49bb46b1970"}
Mar 12 18:22:47 crc kubenswrapper[4926]: I0312 18:22:47.619713 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749fd9494c-gtzjf" event={"ID":"04df6e00-4552-471c-9ae0-f45362e7e2b4","Type":"ContainerStarted","Data":"2d38489365b71e942082167910ea77ad7b51873f3aed248a2bfa421546b7758e"}
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.503843 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" path="/var/lib/kubelet/pods/69fa4317-bafb-462d-b6aa-ea07437277f1/volumes"
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.505071 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72142af-cd95-4044-92d7-ce173109afa7" path="/var/lib/kubelet/pods/c72142af-cd95-4044-92d7-ce173109afa7/volumes"
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.646895 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerStarted","Data":"e3c316681f99a8d367b4d444d86939314f7068d8029528e7292763d77248f72b"}
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.647052 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-log" containerID="cri-o://e192a80af72131c894390e82b963179cefd9639c84c7a61aeb7db9b8d5176057" gracePeriod=30
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.647512 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-httpd" containerID="cri-o://e3c316681f99a8d367b4d444d86939314f7068d8029528e7292763d77248f72b" gracePeriod=30
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.656673 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" event={"ID":"af5b15ef-fa73-4ef4-9235-41da81503d2c","Type":"ContainerStarted","Data":"acee92af6092ee87ba774fbb0a9ef600de19a9fc8c8b93c392fa1761ce8a5481"}
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.657784 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh"
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.681623 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerStarted","Data":"c5bf2f687495f9063b1f714865d39313a50f9f846c3751dc36bfcd3a2678bb59"}
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.681686 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerStarted","Data":"d6e2011fce2ea9b854a62890216980791595e226fd3037e1e5d5033caf9aa264"}
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.681832 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-log" containerID="cri-o://d6e2011fce2ea9b854a62890216980791595e226fd3037e1e5d5033caf9aa264" gracePeriod=30
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.682129 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-httpd" containerID="cri-o://c5bf2f687495f9063b1f714865d39313a50f9f846c3751dc36bfcd3a2678bb59" gracePeriod=30
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.687118 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.687096939 podStartE2EDuration="4.687096939s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:48.665739305 +0000 UTC m=+1209.034365648" watchObservedRunningTime="2026-03-12 18:22:48.687096939 +0000 UTC m=+1209.055723272"
Mar 12 18:22:48 crc kubenswrapper[4926]: I0312 18:22:48.695969 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" podStartSLOduration=4.695951354 podStartE2EDuration="4.695951354s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:48.687340587 +0000 UTC m=+1209.055966920" watchObservedRunningTime="2026-03-12 18:22:48.695951354 +0000 UTC m=+1209.064577677"
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.694772 4926 generic.go:334] "Generic (PLEG): container finished" podID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerID="e3c316681f99a8d367b4d444d86939314f7068d8029528e7292763d77248f72b" exitCode=0
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.695095 4926 generic.go:334] "Generic (PLEG): container finished" podID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerID="e192a80af72131c894390e82b963179cefd9639c84c7a61aeb7db9b8d5176057" exitCode=143
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.695133 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerDied","Data":"e3c316681f99a8d367b4d444d86939314f7068d8029528e7292763d77248f72b"}
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.695158 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerDied","Data":"e192a80af72131c894390e82b963179cefd9639c84c7a61aeb7db9b8d5176057"}
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.697702 4926 generic.go:334] "Generic (PLEG): container finished" podID="fdce3e2c-d108-4904-b4ab-8d95d4838bad" containerID="334e13044642c4b0a613f2bfccdc8e4e0067f4f734093e62a567052168539e3c" exitCode=0
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.697773 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrv24" event={"ID":"fdce3e2c-d108-4904-b4ab-8d95d4838bad","Type":"ContainerDied","Data":"334e13044642c4b0a613f2bfccdc8e4e0067f4f734093e62a567052168539e3c"}
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.701049 4926 generic.go:334] "Generic (PLEG): container finished" podID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerID="c5bf2f687495f9063b1f714865d39313a50f9f846c3751dc36bfcd3a2678bb59" exitCode=143
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.701071 4926 generic.go:334] "Generic (PLEG): container finished" podID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerID="d6e2011fce2ea9b854a62890216980791595e226fd3037e1e5d5033caf9aa264" exitCode=143
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.701137 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerDied","Data":"c5bf2f687495f9063b1f714865d39313a50f9f846c3751dc36bfcd3a2678bb59"}
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.701194 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerDied","Data":"d6e2011fce2ea9b854a62890216980791595e226fd3037e1e5d5033caf9aa264"}
Mar 12 18:22:49 crc kubenswrapper[4926]: I0312 18:22:49.720619 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.720599672 podStartE2EDuration="5.720599672s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:22:48.719910269 +0000 UTC m=+1209.088536642" watchObservedRunningTime="2026-03-12 18:22:49.720599672 +0000 UTC m=+1210.089226005"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.858136 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84786596b5-28lbk"]
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.882138 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-89554fb64-s9c6q"]
Mar 12 18:22:52 crc kubenswrapper[4926]: E0312 18:22:52.882481 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="dnsmasq-dns"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.882494 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="dnsmasq-dns"
Mar 12 18:22:52 crc kubenswrapper[4926]: E0312 18:22:52.882509 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="init"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.882515 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="init"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.882675 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="69fa4317-bafb-462d-b6aa-ea07437277f1" containerName="dnsmasq-dns"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.883534 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.888759 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.905207 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-89554fb64-s9c6q"]
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958578 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p78br\" (UniqueName: \"kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958644 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958668 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958698 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958714 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958729 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.958830 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:52 crc kubenswrapper[4926]: I0312 18:22:52.965978 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"]
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.000778 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c6848d8cd-cq57n"]
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.002286 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.036000 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6848d8cd-cq57n"]
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.060892 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-tls-certs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.060945 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-scripts\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.060994 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061035 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmcq8\" (UniqueName: \"kubernetes.io/projected/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-kube-api-access-xmcq8\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061053 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-config-data\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061073 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p78br\" (UniqueName: \"kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061102 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-combined-ca-bundle\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061127 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-logs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061144 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061161 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-secret-key\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061182 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061209 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061228 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.061249 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.062233 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.062654 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.063231 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.070695 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.072261 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.077242 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.082590 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p78br\" (UniqueName: \"kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br\") pod \"horizon-89554fb64-s9c6q\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.162415 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmcq8\" (UniqueName: \"kubernetes.io/projected/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-kube-api-access-xmcq8\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.162962 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-config-data\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.163030 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-combined-ca-bundle\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.163092 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-logs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.163112 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-secret-key\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.163247 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-tls-certs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.163282 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-scripts\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.164209 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-logs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.164277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-config-data\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.164462 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-scripts\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.178021 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-combined-ca-bundle\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.178535 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-tls-certs\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.179820 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-horizon-secret-key\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.181253 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmcq8\" (UniqueName: \"kubernetes.io/projected/a1ae8f23-3518-430a-bbcf-e7be0cb8282e-kube-api-access-xmcq8\") pod \"horizon-5c6848d8cd-cq57n\" (UID: \"a1ae8f23-3518-430a-bbcf-e7be0cb8282e\") " pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.216735 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:22:53 crc kubenswrapper[4926]: I0312 18:22:53.317499 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6848d8cd-cq57n"
Mar 12 18:22:55 crc kubenswrapper[4926]: I0312 18:22:55.070623 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh"
Mar 12 18:22:55 crc kubenswrapper[4926]: I0312 18:22:55.145349 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"]
Mar 12 18:22:55 crc kubenswrapper[4926]: I0312 18:22:55.145640 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-7vm4l" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" containerID="cri-o://633289debf3f6b1a2f62e3d378805f919d6393fc63c3e3559b1339530737acbe" gracePeriod=10
Mar 12 18:22:55 crc kubenswrapper[4926]: I0312 18:22:55.757147 4926 generic.go:334] "Generic (PLEG): container finished" podID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerID="633289debf3f6b1a2f62e3d378805f919d6393fc63c3e3559b1339530737acbe" exitCode=0
Mar 12 18:22:55 crc kubenswrapper[4926]: I0312 18:22:55.757323 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7vm4l" event={"ID":"f569f4cb-b487-41d8-bab4-5c2d7aba2219","Type":"ContainerDied","Data":"633289debf3f6b1a2f62e3d378805f919d6393fc63c3e3559b1339530737acbe"}
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.682988 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.698270 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771579 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771638 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771697 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771753 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771783 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771828 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771856 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771903 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn68d\" (UniqueName: \"kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.771971 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772002 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772054 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2njd\" (UniqueName: \"kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772080 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772110 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772133 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772160 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"2900c0f0-0ed1-493a-8bad-ed203682251d\" (UID: \"2900c0f0-0ed1-493a-8bad-ed203682251d\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772220 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run\") pod \"7a40ca3d-194d-4baa-8417-3614c8aeef08\" (UID: \"7a40ca3d-194d-4baa-8417-3614c8aeef08\") "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772306 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772619 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs" (OuterVolumeSpecName: "logs") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772863 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772871 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.772897 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.778580 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd" (OuterVolumeSpecName: "kube-api-access-z2njd") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "kube-api-access-z2njd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.778888 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs" (OuterVolumeSpecName: "logs") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.781793 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts" (OuterVolumeSpecName: "scripts") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.781833 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.783724 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts" (OuterVolumeSpecName: "scripts") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.786052 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a40ca3d-194d-4baa-8417-3614c8aeef08","Type":"ContainerDied","Data":"d093554ce00472559ced5d4b721fe022998a063b108c7d7113b402e4a98bca54"}
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.786080 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.786180 4926 scope.go:117] "RemoveContainer" containerID="c5bf2f687495f9063b1f714865d39313a50f9f846c3751dc36bfcd3a2678bb59"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.787638 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d" (OuterVolumeSpecName: "kube-api-access-xn68d") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "kube-api-access-xn68d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.795044 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2900c0f0-0ed1-493a-8bad-ed203682251d","Type":"ContainerDied","Data":"7a86c1cac56bc2792ca8a01485262ffdadcde243619bf97536386064fe81ed59"}
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.795127 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.801021 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.812195 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.813699 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.832619 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data" (OuterVolumeSpecName: "config-data") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.833470 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.833525 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data" (OuterVolumeSpecName: "config-data") pod "7a40ca3d-194d-4baa-8417-3614c8aeef08" (UID: "7a40ca3d-194d-4baa-8417-3614c8aeef08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.834004 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2900c0f0-0ed1-493a-8bad-ed203682251d" (UID: "2900c0f0-0ed1-493a-8bad-ed203682251d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875206 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875239 4926 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875252 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn68d\" (UniqueName: \"kubernetes.io/projected/7a40ca3d-194d-4baa-8417-3614c8aeef08-kube-api-access-xn68d\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875285 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875296 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2900c0f0-0ed1-493a-8bad-ed203682251d-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875307 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2njd\" (UniqueName: \"kubernetes.io/projected/2900c0f0-0ed1-493a-8bad-ed203682251d-kube-api-access-z2njd\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875315 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875324 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875332 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875343 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875351 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a40ca3d-194d-4baa-8417-3614c8aeef08-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875360 4926 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875368 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2900c0f0-0ed1-493a-8bad-ed203682251d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.875376 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a40ca3d-194d-4baa-8417-3614c8aeef08-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.894525 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.895966 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.976421 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:57 crc kubenswrapper[4926]: I0312 18:22:57.976477 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.019834 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7vm4l" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.135185 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.162009 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.182770 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.203515 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.209664 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: E0312 18:22:58.209968 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.209984 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: E0312 18:22:58.209995 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210003 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: E0312 18:22:58.210023 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210031 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: E0312 18:22:58.210041 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210047 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210204 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210218 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210227 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" containerName="glance-httpd"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.210244 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" containerName="glance-log"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.213219 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.227400 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.227779 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.237838 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2zfbp"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.238050 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.256917 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.258466 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.264493 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.265126 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.271257 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281535 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281595 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qgwd\" (UniqueName: \"kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281642 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281687 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281725 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281751 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281786 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.281827 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.291906 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384721 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384772 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384834 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qgwd\" (UniqueName: \"kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384876 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384924 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384957 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.384982 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385015 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385042 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385067 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385108 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385151 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385177 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2m5\" (UniqueName: \"kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385217 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.385250 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.387981 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.392522 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.393348 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.395779 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.396248 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.398274 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.399291 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.424602 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qgwd\" (UniqueName: \"kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.431373 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.486837 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.486904 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.486923 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487004 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487023 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487061 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487081 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2m5\" (UniqueName: \"kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487132 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487234 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.487606 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.488967 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.493943 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.499001 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.499683 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.501165 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2900c0f0-0ed1-493a-8bad-ed203682251d" path="/var/lib/kubelet/pods/2900c0f0-0ed1-493a-8bad-ed203682251d/volumes" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.502859 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a40ca3d-194d-4baa-8417-3614c8aeef08" path="/var/lib/kubelet/pods/7a40ca3d-194d-4baa-8417-3614c8aeef08/volumes" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.505635 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.509951 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2m5\" (UniqueName: \"kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.518940 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") " pod="openstack/glance-default-external-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.604030 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:22:58 crc kubenswrapper[4926]: I0312 18:22:58.656886 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 18:23:02 crc kubenswrapper[4926]: I0312 18:23:02.836144 4926 scope.go:117] "RemoveContainer" containerID="d82acd09ec6775228cff799be28fec22c80e7a284a7159b9a2b0821199a8fa17" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.003864 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.004051 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c5h548hbdhf8h556h5c5hd4h5c8h87h5ffh7bh59h694hf7h5ddh7bh5bhdfhdch657hd4h654h68fh65bh5b5h64h658h66dh577h8dhc8h5cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flm2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7786dc65fc-n6hp6_openstack(8c2a8323-527c-4e19-ab55-3c291c4d538f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.006349 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7786dc65fc-n6hp6" podUID="8c2a8323-527c-4e19-ab55-3c291c4d538f" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.019772 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7vm4l" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: 
connect: connection refused" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.020103 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.020302 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf4h5c4hcdh567h54ch555hdh97h696h645h89hc6h5bfh64h5cchdfh544hch596hf9h547h699h676h59bh89h55dh676h675h58bh555h5fch697q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8g599,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-749fd9494c-gtzjf_openstack(04df6e00-4552-471c-9ae0-f45362e7e2b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.026095 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.026227 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f5hb6h5bh564h58fh5b7h65ch689h644h5f6h8ch58dh679h599h7fh555h586h5d8h57dh5b9h66fh5d7h59fh8h678h9h6chb9h7ch58bh5d5h66bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7bxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-84786596b5-28lbk_openstack(1ca9ea6f-908c-48db-9313-c3ff4809a993): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.027607 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-749fd9494c-gtzjf" podUID="04df6e00-4552-471c-9ae0-f45362e7e2b4" Mar 12 18:23:03 crc kubenswrapper[4926]: E0312 18:23:03.028396 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-84786596b5-28lbk" podUID="1ca9ea6f-908c-48db-9313-c3ff4809a993" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.083407 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106050 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106112 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpnk7\" (UniqueName: \"kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106164 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106233 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106288 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.106361 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys\") pod \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\" (UID: \"fdce3e2c-d108-4904-b4ab-8d95d4838bad\") " Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.113708 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.114280 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7" (OuterVolumeSpecName: "kube-api-access-rpnk7") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "kube-api-access-rpnk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.115285 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts" (OuterVolumeSpecName: "scripts") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.139274 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.143996 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data" (OuterVolumeSpecName: "config-data") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.150550 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdce3e2c-d108-4904-b4ab-8d95d4838bad" (UID: "fdce3e2c-d108-4904-b4ab-8d95d4838bad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208037 4926 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208065 4926 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208075 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208084 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpnk7\" (UniqueName: \"kubernetes.io/projected/fdce3e2c-d108-4904-b4ab-8d95d4838bad-kube-api-access-rpnk7\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208094 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.208101 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdce3e2c-d108-4904-b4ab-8d95d4838bad-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.846168 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hrv24" Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.846029 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hrv24" event={"ID":"fdce3e2c-d108-4904-b4ab-8d95d4838bad","Type":"ContainerDied","Data":"cbd223739736371712c30899d1265bcdf31be415ef718a7391e85bb96801c5f5"} Mar 12 18:23:03 crc kubenswrapper[4926]: I0312 18:23:03.861091 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd223739736371712c30899d1265bcdf31be415ef718a7391e85bb96801c5f5" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.191736 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hrv24"] Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.193178 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hrv24"] Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.263724 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-98lfj"] Mar 12 18:23:04 crc kubenswrapper[4926]: E0312 18:23:04.264167 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdce3e2c-d108-4904-b4ab-8d95d4838bad" containerName="keystone-bootstrap" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.264190 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdce3e2c-d108-4904-b4ab-8d95d4838bad" containerName="keystone-bootstrap" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.264360 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdce3e2c-d108-4904-b4ab-8d95d4838bad" containerName="keystone-bootstrap" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.266353 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.269995 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vrsc7" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.270154 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.270267 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.270374 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.270913 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.281252 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-98lfj"] Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335000 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335052 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335076 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwc8\" (UniqueName: \"kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335143 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335203 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.335262 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436286 4926 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436349 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436396 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436453 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436477 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.436493 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwc8\" (UniqueName: \"kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.441548 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.441616 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.441880 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.442237 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") 
" pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.446766 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.451049 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwc8\" (UniqueName: \"kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8\") pod \"keystone-bootstrap-98lfj\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.499618 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdce3e2c-d108-4904-b4ab-8d95d4838bad" path="/var/lib/kubelet/pods/fdce3e2c-d108-4904-b4ab-8d95d4838bad/volumes" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.601004 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.857499 4926 generic.go:334] "Generic (PLEG): container finished" podID="e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" containerID="d40939b357197123b7378525a59c3f91e7240cb30140873883a85a0672ad25de" exitCode=0 Mar 12 18:23:04 crc kubenswrapper[4926]: I0312 18:23:04.857540 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vc5cr" event={"ID":"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b","Type":"ContainerDied","Data":"d40939b357197123b7378525a59c3f91e7240cb30140873883a85a0672ad25de"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.594694 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.602088 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.610348 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.615168 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.637630 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750320 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config\") pod \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750394 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key\") pod \"1ca9ea6f-908c-48db-9313-c3ff4809a993\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750421 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv46v\" (UniqueName: \"kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v\") pod \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750500 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df6kk\" (UniqueName: \"kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk\") pod \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750534 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data\") pod \"8c2a8323-527c-4e19-ab55-3c291c4d538f\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.750610 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle\") pod \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\" (UID: \"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751391 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts\") pod \"1ca9ea6f-908c-48db-9313-c3ff4809a993\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751471 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb\") pod \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751505 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb\") pod \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751576 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts\") pod \"04df6e00-4552-471c-9ae0-f45362e7e2b4\" (UID: 
\"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751742 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7bxq\" (UniqueName: \"kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq\") pod \"1ca9ea6f-908c-48db-9313-c3ff4809a993\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751809 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs\") pod \"1ca9ea6f-908c-48db-9313-c3ff4809a993\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.751842 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data\") pod \"1ca9ea6f-908c-48db-9313-c3ff4809a993\" (UID: \"1ca9ea6f-908c-48db-9313-c3ff4809a993\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752081 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data" (OuterVolumeSpecName: "config-data") pod "8c2a8323-527c-4e19-ab55-3c291c4d538f" (UID: "8c2a8323-527c-4e19-ab55-3c291c4d538f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752111 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc\") pod \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752188 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs\") pod \"8c2a8323-527c-4e19-ab55-3c291c4d538f\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752287 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config\") pod \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\" (UID: \"f569f4cb-b487-41d8-bab4-5c2d7aba2219\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752331 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flm2j\" (UniqueName: \"kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j\") pod \"8c2a8323-527c-4e19-ab55-3c291c4d538f\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752355 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs\") pod \"04df6e00-4552-471c-9ae0-f45362e7e2b4\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752375 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g599\" (UniqueName: \"kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599\") pod 
\"04df6e00-4552-471c-9ae0-f45362e7e2b4\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752656 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts" (OuterVolumeSpecName: "scripts") pod "04df6e00-4552-471c-9ae0-f45362e7e2b4" (UID: "04df6e00-4552-471c-9ae0-f45362e7e2b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752755 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key\") pod \"8c2a8323-527c-4e19-ab55-3c291c4d538f\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752795 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key\") pod \"04df6e00-4552-471c-9ae0-f45362e7e2b4\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752828 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data\") pod \"04df6e00-4552-471c-9ae0-f45362e7e2b4\" (UID: \"04df6e00-4552-471c-9ae0-f45362e7e2b4\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.752852 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts\") pod \"8c2a8323-527c-4e19-ab55-3c291c4d538f\" (UID: \"8c2a8323-527c-4e19-ab55-3c291c4d538f\") " Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.753083 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs" (OuterVolumeSpecName: "logs") pod "1ca9ea6f-908c-48db-9313-c3ff4809a993" (UID: "1ca9ea6f-908c-48db-9313-c3ff4809a993"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.753543 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts" (OuterVolumeSpecName: "scripts") pod "8c2a8323-527c-4e19-ab55-3c291c4d538f" (UID: "8c2a8323-527c-4e19-ab55-3c291c4d538f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.753553 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts" (OuterVolumeSpecName: "scripts") pod "1ca9ea6f-908c-48db-9313-c3ff4809a993" (UID: "1ca9ea6f-908c-48db-9313-c3ff4809a993"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.753865 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.754152 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.754239 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca9ea6f-908c-48db-9313-c3ff4809a993-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.754315 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2a8323-527c-4e19-ab55-3c291c4d538f-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.755822 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk" (OuterVolumeSpecName: "kube-api-access-df6kk") pod "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" (UID: "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b"). InnerVolumeSpecName "kube-api-access-df6kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.756087 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs" (OuterVolumeSpecName: "logs") pod "8c2a8323-527c-4e19-ab55-3c291c4d538f" (UID: "8c2a8323-527c-4e19-ab55-3c291c4d538f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.756692 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1ca9ea6f-908c-48db-9313-c3ff4809a993" (UID: "1ca9ea6f-908c-48db-9313-c3ff4809a993"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.757284 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j" (OuterVolumeSpecName: "kube-api-access-flm2j") pod "8c2a8323-527c-4e19-ab55-3c291c4d538f" (UID: "8c2a8323-527c-4e19-ab55-3c291c4d538f"). InnerVolumeSpecName "kube-api-access-flm2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.757563 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs" (OuterVolumeSpecName: "logs") pod "04df6e00-4552-471c-9ae0-f45362e7e2b4" (UID: "04df6e00-4552-471c-9ae0-f45362e7e2b4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.758035 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data" (OuterVolumeSpecName: "config-data") pod "1ca9ea6f-908c-48db-9313-c3ff4809a993" (UID: "1ca9ea6f-908c-48db-9313-c3ff4809a993"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.760332 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data" (OuterVolumeSpecName: "config-data") pod "04df6e00-4552-471c-9ae0-f45362e7e2b4" (UID: "04df6e00-4552-471c-9ae0-f45362e7e2b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.761075 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599" (OuterVolumeSpecName: "kube-api-access-8g599") pod "04df6e00-4552-471c-9ae0-f45362e7e2b4" (UID: "04df6e00-4552-471c-9ae0-f45362e7e2b4"). InnerVolumeSpecName "kube-api-access-8g599". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.761621 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8c2a8323-527c-4e19-ab55-3c291c4d538f" (UID: "8c2a8323-527c-4e19-ab55-3c291c4d538f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.764191 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v" (OuterVolumeSpecName: "kube-api-access-vv46v") pod "f569f4cb-b487-41d8-bab4-5c2d7aba2219" (UID: "f569f4cb-b487-41d8-bab4-5c2d7aba2219"). InnerVolumeSpecName "kube-api-access-vv46v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.764981 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "04df6e00-4552-471c-9ae0-f45362e7e2b4" (UID: "04df6e00-4552-471c-9ae0-f45362e7e2b4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.780605 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq" (OuterVolumeSpecName: "kube-api-access-p7bxq") pod "1ca9ea6f-908c-48db-9313-c3ff4809a993" (UID: "1ca9ea6f-908c-48db-9313-c3ff4809a993"). InnerVolumeSpecName "kube-api-access-p7bxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.790524 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config" (OuterVolumeSpecName: "config") pod "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" (UID: "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.792894 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" (UID: "e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.805475 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f569f4cb-b487-41d8-bab4-5c2d7aba2219" (UID: "f569f4cb-b487-41d8-bab4-5c2d7aba2219"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.812030 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config" (OuterVolumeSpecName: "config") pod "f569f4cb-b487-41d8-bab4-5c2d7aba2219" (UID: "f569f4cb-b487-41d8-bab4-5c2d7aba2219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.814147 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f569f4cb-b487-41d8-bab4-5c2d7aba2219" (UID: "f569f4cb-b487-41d8-bab4-5c2d7aba2219"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.817670 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f569f4cb-b487-41d8-bab4-5c2d7aba2219" (UID: "f569f4cb-b487-41d8-bab4-5c2d7aba2219"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860675 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df6kk\" (UniqueName: \"kubernetes.io/projected/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-kube-api-access-df6kk\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860721 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860730 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860739 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860749 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860760 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7bxq\" (UniqueName: \"kubernetes.io/projected/1ca9ea6f-908c-48db-9313-c3ff4809a993-kube-api-access-p7bxq\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860768 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ca9ea6f-908c-48db-9313-c3ff4809a993-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860776 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860784 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c2a8323-527c-4e19-ab55-3c291c4d538f-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860792 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f569f4cb-b487-41d8-bab4-5c2d7aba2219-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860800 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flm2j\" (UniqueName: \"kubernetes.io/projected/8c2a8323-527c-4e19-ab55-3c291c4d538f-kube-api-access-flm2j\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860808 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04df6e00-4552-471c-9ae0-f45362e7e2b4-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860817 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g599\" (UniqueName: \"kubernetes.io/projected/04df6e00-4552-471c-9ae0-f45362e7e2b4-kube-api-access-8g599\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: 
I0312 18:23:10.860826 4926 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/04df6e00-4552-471c-9ae0-f45362e7e2b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860834 4926 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c2a8323-527c-4e19-ab55-3c291c4d538f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860842 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04df6e00-4552-471c-9ae0-f45362e7e2b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860850 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860858 4926 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ca9ea6f-908c-48db-9313-c3ff4809a993-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.860866 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv46v\" (UniqueName: \"kubernetes.io/projected/f569f4cb-b487-41d8-bab4-5c2d7aba2219-kube-api-access-vv46v\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.905971 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84786596b5-28lbk" event={"ID":"1ca9ea6f-908c-48db-9313-c3ff4809a993","Type":"ContainerDied","Data":"87827b8a20d43f6415fa4987b8da56ced353164794c7d29438fb6ffa36394a39"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.906223 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84786596b5-28lbk" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.907290 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7786dc65fc-n6hp6" event={"ID":"8c2a8323-527c-4e19-ab55-3c291c4d538f","Type":"ContainerDied","Data":"1a2407e300165d17a9ee94f4b0ab357bc8d93ae3d9e96138519ce22a7544a84a"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.907416 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7786dc65fc-n6hp6" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.915394 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7vm4l" event={"ID":"f569f4cb-b487-41d8-bab4-5c2d7aba2219","Type":"ContainerDied","Data":"1a2ca21647a28c29a2f5caa87bf2be645c7eb58039451905ac1f8540e674ede0"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.915427 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7vm4l" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.917394 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749fd9494c-gtzjf" event={"ID":"04df6e00-4552-471c-9ae0-f45362e7e2b4","Type":"ContainerDied","Data":"2d38489365b71e942082167910ea77ad7b51873f3aed248a2bfa421546b7758e"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.917462 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-749fd9494c-gtzjf" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.920923 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vc5cr" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.920927 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vc5cr" event={"ID":"e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b","Type":"ContainerDied","Data":"85be7b1fc6b389d0d4f2769e75e7678715b75c3e095cab19a3e0aad9bb5656b7"} Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.920966 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85be7b1fc6b389d0d4f2769e75e7678715b75c3e095cab19a3e0aad9bb5656b7" Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.968314 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.984255 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7786dc65fc-n6hp6"] Mar 12 18:23:10 crc kubenswrapper[4926]: I0312 18:23:10.997929 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84786596b5-28lbk"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.005494 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84786596b5-28lbk"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.014598 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.017930 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7vm4l"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.057821 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.068015 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-749fd9494c-gtzjf"] Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.747719 4926 scope.go:117] "RemoveContainer" containerID="d6e2011fce2ea9b854a62890216980791595e226fd3037e1e5d5033caf9aa264" Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.764670 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.764954 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fgqfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lgvzs_openstack(dac4b5d6-fb31-4955-8679-db9d3ff63c10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.766202 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lgvzs" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.886901 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.890873 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="init" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.890992 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="init" Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.891097 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" containerName="neutron-db-sync" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.891191 4926 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" containerName="neutron-db-sync" Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.891314 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.891409 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.891757 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" containerName="neutron-db-sync" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.891873 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.893061 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:11 crc kubenswrapper[4926]: I0312 18:23:11.915503 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:11 crc kubenswrapper[4926]: E0312 18:23:11.979023 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lgvzs" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.023178 4926 scope.go:117] "RemoveContainer" containerID="e3c316681f99a8d367b4d444d86939314f7068d8029528e7292763d77248f72b" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.061387 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"] Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.062858 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.069933 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.070270 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.070466 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qpbhw" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.070591 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.077992 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"] Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100403 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100506 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwp6h\" (UniqueName: \"kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100614 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100656 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100762 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.100833 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.191064 4926 scope.go:117] "RemoveContainer" containerID="e192a80af72131c894390e82b963179cefd9639c84c7a61aeb7db9b8d5176057" Mar 12 18:23:12 crc kubenswrapper[4926]: 
I0312 18:23:12.202959 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.209232 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.209801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.209918 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.209983 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwp6h\" (UniqueName: \"kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210022 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210142 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210160 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210182 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210209 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vz6v\" (UniqueName: \"kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210230 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210322 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.210968 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.212269 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.213110 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.214460 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.233334 4926 scope.go:117] "RemoveContainer" containerID="633289debf3f6b1a2f62e3d378805f919d6393fc63c3e3559b1339530737acbe" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.236445 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwp6h\" (UniqueName: \"kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h\") pod \"dnsmasq-dns-55f844cf75-rvkqm\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.238545 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.289740 4926 scope.go:117] "RemoveContainer" containerID="50524e634e7b1a90622cd86f4a3bdca7cea0ea5d1a412cda03a7f2f2cce56ce2" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.312204 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vz6v\" (UniqueName: \"kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.312243 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.313547 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.313648 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.313758 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.316336 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.316834 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.333878 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.335294 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle\") pod \"neutron-884f7b65b-tpkzl\" (UID: 
\"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.342034 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vz6v\" (UniqueName: \"kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v\") pod \"neutron-884f7b65b-tpkzl\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") " pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.425432 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.526614 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04df6e00-4552-471c-9ae0-f45362e7e2b4" path="/var/lib/kubelet/pods/04df6e00-4552-471c-9ae0-f45362e7e2b4/volumes" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.527301 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca9ea6f-908c-48db-9313-c3ff4809a993" path="/var/lib/kubelet/pods/1ca9ea6f-908c-48db-9313-c3ff4809a993/volumes" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.528232 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2a8323-527c-4e19-ab55-3c291c4d538f" path="/var/lib/kubelet/pods/8c2a8323-527c-4e19-ab55-3c291c4d538f/volumes" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.528599 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" path="/var/lib/kubelet/pods/f569f4cb-b487-41d8-bab4-5c2d7aba2219/volumes" Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.706561 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6848d8cd-cq57n"] Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.718739 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-89554fb64-s9c6q"] Mar 12 18:23:12 crc kubenswrapper[4926]: W0312 18:23:12.724664 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7d07aa_8c5e_49f3_8d85_4c5e9569c572.slice/crio-47a3a3a3dc751bc7ac0243d80b1ecc826d87bf09a948cad8eeaa84ccc31fd39c WatchSource:0}: Error finding container 47a3a3a3dc751bc7ac0243d80b1ecc826d87bf09a948cad8eeaa84ccc31fd39c: Status 404 returned error can't find the container with id 47a3a3a3dc751bc7ac0243d80b1ecc826d87bf09a948cad8eeaa84ccc31fd39c Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.770772 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-98lfj"] Mar 12 18:23:12 crc kubenswrapper[4926]: W0312 18:23:12.776253 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf5704dd_cd13_4e5f_a77b_01266c63eeba.slice/crio-45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc WatchSource:0}: Error finding container 45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc: Status 404 returned error can't find the container with id 45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 18:23:12.987268 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerStarted","Data":"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c"} Mar 12 18:23:12 crc kubenswrapper[4926]: I0312 
18:23:12.994821 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sgnbr" event={"ID":"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6","Type":"ContainerStarted","Data":"d333e02f3e64bf137ca7da16020fb743166cb48ce41f070ab0d454452957be39"} Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:12.999110 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.000111 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-98lfj" event={"ID":"af5704dd-cd13-4e5f-a77b-01266c63eeba","Type":"ContainerStarted","Data":"45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc"} Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.001019 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-txt96" event={"ID":"7a0f2830-bf50-4195-9dad-d4d2c9529ee9","Type":"ContainerStarted","Data":"5b07bfa58e0c68e62027f67884fd04fdf4f157c10b82889fbed4fe1cd06870d8"} Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.017211 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-sgnbr" podStartSLOduration=4.28196941 podStartE2EDuration="29.017192209s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="2026-03-12 18:22:45.724868182 +0000 UTC m=+1206.093494515" lastFinishedPulling="2026-03-12 18:23:10.460090981 +0000 UTC m=+1230.828717314" observedRunningTime="2026-03-12 18:23:13.017078925 +0000 UTC m=+1233.385705258" watchObservedRunningTime="2026-03-12 18:23:13.017192209 +0000 UTC m=+1233.385818542" Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.022273 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7vm4l" podUID="f569f4cb-b487-41d8-bab4-5c2d7aba2219" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.022529 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6848d8cd-cq57n" event={"ID":"a1ae8f23-3518-430a-bbcf-e7be0cb8282e","Type":"ContainerStarted","Data":"1d58f8674df4594ec9f33db698a8a40eaa76148cc32f01ed65517b4662c964ca"} Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.023829 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerStarted","Data":"47a3a3a3dc751bc7ac0243d80b1ecc826d87bf09a948cad8eeaa84ccc31fd39c"} Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.033894 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-txt96" podStartSLOduration=4.356917418 podStartE2EDuration="29.033876433s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="2026-03-12 18:22:45.779577084 +0000 UTC m=+1206.148203417" lastFinishedPulling="2026-03-12 18:23:10.456536069 +0000 UTC m=+1230.825162432" observedRunningTime="2026-03-12 18:23:13.03187266 +0000 UTC m=+1233.400498993" watchObservedRunningTime="2026-03-12 18:23:13.033876433 +0000 UTC m=+1233.402502766" Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.067735 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.207636 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"] Mar 12 
18:23:13 crc kubenswrapper[4926]: W0312 18:23:13.211869 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3 WatchSource:0}: Error finding container 4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3: Status 404 returned error can't find the container with id 4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3 Mar 12 18:23:13 crc kubenswrapper[4926]: I0312 18:23:13.833118 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.034788 4926 generic.go:334] "Generic (PLEG): container finished" podID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerID="2f62316dc5afc9b2e89e828a9e035d936c5f1005f1c85a5b2bad886d39f3e886" exitCode=0 Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.034851 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" event={"ID":"b5cd3f65-2af4-4a38-ab87-c266452f8c5a","Type":"ContainerDied","Data":"2f62316dc5afc9b2e89e828a9e035d936c5f1005f1c85a5b2bad886d39f3e886"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.034875 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" event={"ID":"b5cd3f65-2af4-4a38-ab87-c266452f8c5a","Type":"ContainerStarted","Data":"a1cf427ff4b305faffde9b50b631a3a735bc0b296fb816c491a872c6626c0d60"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.037942 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerStarted","Data":"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.037975 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerStarted","Data":"4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.043720 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerStarted","Data":"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.043758 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerStarted","Data":"600cdc83756bdbe2006c6d01ba6698b38db80c94206722c7e9c45bd9b1332b71"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.048272 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6848d8cd-cq57n" event={"ID":"a1ae8f23-3518-430a-bbcf-e7be0cb8282e","Type":"ContainerStarted","Data":"0e660ab765557cad74f028b05adf2a2017733c5b7cad2a31d7ba9218f60c1b1a"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.052271 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerStarted","Data":"a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.062521 4926 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-98lfj" event={"ID":"af5704dd-cd13-4e5f-a77b-01266c63eeba","Type":"ContainerStarted","Data":"1b2fbc6d93b78d1252c9b5b62fe658eafa93cdb86d8acefaa37b868dbfa19698"} Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.319781 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-98lfj" podStartSLOduration=10.319763313 podStartE2EDuration="10.319763313s" podCreationTimestamp="2026-03-12 18:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:14.079274556 +0000 UTC m=+1234.447900899" watchObservedRunningTime="2026-03-12 18:23:14.319763313 +0000 UTC m=+1234.688389646" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.331385 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.333348 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.336694 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.337225 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.353536 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.492956 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493010 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493122 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493210 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493388 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") 
" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493431 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzgvh\" (UniqueName: \"kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.493474 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.600940 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601107 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601243 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601434 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601512 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzgvh\" (UniqueName: \"kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601546 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.601665 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc 
kubenswrapper[4926]: I0312 18:23:14.613464 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.622762 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.624732 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.628277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.631425 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.632229 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzgvh\" (UniqueName: \"kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.635214 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs\") pod \"neutron-65c5c86775-mct68\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:14 crc kubenswrapper[4926]: I0312 18:23:14.654929 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:15 crc kubenswrapper[4926]: I0312 18:23:15.100549 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerStarted","Data":"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"} Mar 12 18:23:15 crc kubenswrapper[4926]: I0312 18:23:15.100959 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:15 crc kubenswrapper[4926]: I0312 18:23:15.102681 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerStarted","Data":"ac524b6af20000079258195bab5b0ef6d7a0ae5f18a8830578a4f4e1b53a9c85"} Mar 12 18:23:15 crc kubenswrapper[4926]: I0312 18:23:15.123938 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-884f7b65b-tpkzl" podStartSLOduration=3.123920696 podStartE2EDuration="3.123920696s" podCreationTimestamp="2026-03-12 18:23:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:15.121364786 +0000 UTC m=+1235.489991119" watchObservedRunningTime="2026-03-12 18:23:15.123920696 +0000 UTC m=+1235.492547029" Mar 12 18:23:15 crc kubenswrapper[4926]: I0312 18:23:15.324950 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.123430 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6848d8cd-cq57n" event={"ID":"a1ae8f23-3518-430a-bbcf-e7be0cb8282e","Type":"ContainerStarted","Data":"100a461384abbd26a4bc41fefac81726e40c151486f6992479e0630f10e31127"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.142674 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerStarted","Data":"73938b8cf4a571a4406931a492f2e033c549c10b7a2bcdf0ca6b902ede59649d"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.142713 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerStarted","Data":"809f7347552b8f928619d96194c02f26e8cc98b9b07e8efde999fc4f99c7437f"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.150372 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c6848d8cd-cq57n" podStartSLOduration=23.530879487 podStartE2EDuration="24.150350544s" podCreationTimestamp="2026-03-12 18:22:52 +0000 UTC" firstStartedPulling="2026-03-12 18:23:12.733682954 +0000 UTC m=+1233.102309297" lastFinishedPulling="2026-03-12 18:23:13.353154021 +0000 UTC m=+1233.721780354" observedRunningTime="2026-03-12 18:23:16.146633698 +0000 UTC m=+1236.515260041" watchObservedRunningTime="2026-03-12 18:23:16.150350544 +0000 UTC m=+1236.518976877" Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.152740 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerStarted","Data":"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.152782 4926 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerStarted","Data":"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.152793 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerStarted","Data":"7b5dc2f13fd79b24bbd53ad5c8577f6fcd9167f1b3fea4e0a8139bb248f187ac"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.153662 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.159581 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerStarted","Data":"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.163701 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerStarted","Data":"2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.170930 4926 generic.go:334] "Generic (PLEG): container finished" podID="7a0f2830-bf50-4195-9dad-d4d2c9529ee9" containerID="5b07bfa58e0c68e62027f67884fd04fdf4f157c10b82889fbed4fe1cd06870d8" exitCode=0 Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.173690 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-txt96" event={"ID":"7a0f2830-bf50-4195-9dad-d4d2c9529ee9","Type":"ContainerDied","Data":"5b07bfa58e0c68e62027f67884fd04fdf4f157c10b82889fbed4fe1cd06870d8"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.186636 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=18.186622092 podStartE2EDuration="18.186622092s" podCreationTimestamp="2026-03-12 18:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:16.16296131 +0000 UTC m=+1236.531587663" watchObservedRunningTime="2026-03-12 18:23:16.186622092 +0000 UTC m=+1236.555248425" Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.187225 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" event={"ID":"b5cd3f65-2af4-4a38-ab87-c266452f8c5a","Type":"ContainerStarted","Data":"3d75e203f5f24817d9aa7e9a6cf9f743a4b6a96bd884d103fc15a23a410323b0"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.188118 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.194988 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-65c5c86775-mct68" podStartSLOduration=2.194973494 podStartE2EDuration="2.194973494s" podCreationTimestamp="2026-03-12 18:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:16.185804577 +0000 UTC m=+1236.554430910" watchObservedRunningTime="2026-03-12 18:23:16.194973494 +0000 UTC m=+1236.563599827" Mar 12 18:23:16 crc kubenswrapper[4926]: 
I0312 18:23:16.201684 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerStarted","Data":"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"} Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.236674 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-89554fb64-s9c6q" podStartSLOduration=23.606097966 podStartE2EDuration="24.236656442s" podCreationTimestamp="2026-03-12 18:22:52 +0000 UTC" firstStartedPulling="2026-03-12 18:23:12.729275665 +0000 UTC m=+1233.097901998" lastFinishedPulling="2026-03-12 18:23:13.359834141 +0000 UTC m=+1233.728460474" observedRunningTime="2026-03-12 18:23:16.234067071 +0000 UTC m=+1236.602693404" watchObservedRunningTime="2026-03-12 18:23:16.236656442 +0000 UTC m=+1236.605282775" Mar 12 18:23:16 crc kubenswrapper[4926]: I0312 18:23:16.289855 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" podStartSLOduration=5.289838821 podStartE2EDuration="5.289838821s" podCreationTimestamp="2026-03-12 18:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:16.285626269 +0000 UTC m=+1236.654252622" watchObservedRunningTime="2026-03-12 18:23:16.289838821 +0000 UTC m=+1236.658465154" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.211697 4926 generic.go:334] "Generic (PLEG): container finished" podID="e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" containerID="d333e02f3e64bf137ca7da16020fb743166cb48ce41f070ab0d454452957be39" exitCode=0 Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.211794 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sgnbr" event={"ID":"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6","Type":"ContainerDied","Data":"d333e02f3e64bf137ca7da16020fb743166cb48ce41f070ab0d454452957be39"} Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.213626 4926 generic.go:334] "Generic (PLEG): container finished" podID="af5704dd-cd13-4e5f-a77b-01266c63eeba" containerID="1b2fbc6d93b78d1252c9b5b62fe658eafa93cdb86d8acefaa37b868dbfa19698" exitCode=0 Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.214625 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-98lfj" event={"ID":"af5704dd-cd13-4e5f-a77b-01266c63eeba","Type":"ContainerDied","Data":"1b2fbc6d93b78d1252c9b5b62fe658eafa93cdb86d8acefaa37b868dbfa19698"} Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.238886 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.23886802 podStartE2EDuration="19.23886802s" podCreationTimestamp="2026-03-12 18:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:16.309472537 +0000 UTC m=+1236.678098890" watchObservedRunningTime="2026-03-12 18:23:17.23886802 +0000 UTC m=+1237.607494353" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.684613 4926 util.go:48] "No ready sandbox for pod can be found. 
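Each "SyncLoop (PLEG)" entry carries its payload after event= as plain JSON: the pod UID, an event type such as ContainerStarted or ContainerDied, and a container (or sandbox) ID. That payload can be lifted straight out and decoded; a small Go sketch using the placement-db-sync event above:

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    // plegEvent mirrors the payload printed after `event=` above.
    type plegEvent struct {
    	ID   string // pod UID
    	Type string // ContainerStarted, ContainerDied, ...
    	Data string // container or sandbox ID
    }

    func main() {
    	raw := `{"ID":"7a0f2830-bf50-4195-9dad-d4d2c9529ee9","Type":"ContainerDied","Data":"5b07bfa58e0c68e62027f67884fd04fdf4f157c10b82889fbed4fe1cd06870d8"}`
    	var ev plegEvent
    	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
    		panic(err)
    	}
    	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
    }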
Need to start a new one" pod="openstack/placement-db-sync-txt96" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.789416 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle\") pod \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.789505 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts\") pod \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.789544 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs\") pod \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.789596 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgd7q\" (UniqueName: \"kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q\") pod \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.789715 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data\") pod \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\" (UID: \"7a0f2830-bf50-4195-9dad-d4d2c9529ee9\") " Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.790402 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs" (OuterVolumeSpecName: "logs") pod "7a0f2830-bf50-4195-9dad-d4d2c9529ee9" (UID: "7a0f2830-bf50-4195-9dad-d4d2c9529ee9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.794937 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q" (OuterVolumeSpecName: "kube-api-access-tgd7q") pod "7a0f2830-bf50-4195-9dad-d4d2c9529ee9" (UID: "7a0f2830-bf50-4195-9dad-d4d2c9529ee9"). InnerVolumeSpecName "kube-api-access-tgd7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.795092 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts" (OuterVolumeSpecName: "scripts") pod "7a0f2830-bf50-4195-9dad-d4d2c9529ee9" (UID: "7a0f2830-bf50-4195-9dad-d4d2c9529ee9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.814931 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0f2830-bf50-4195-9dad-d4d2c9529ee9" (UID: "7a0f2830-bf50-4195-9dad-d4d2c9529ee9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.815809 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data" (OuterVolumeSpecName: "config-data") pod "7a0f2830-bf50-4195-9dad-d4d2c9529ee9" (UID: "7a0f2830-bf50-4195-9dad-d4d2c9529ee9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.891983 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgd7q\" (UniqueName: \"kubernetes.io/projected/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-kube-api-access-tgd7q\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.892025 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.892036 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.892045 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:17 crc kubenswrapper[4926]: I0312 18:23:17.892055 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a0f2830-bf50-4195-9dad-d4d2c9529ee9-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.223931 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-txt96" event={"ID":"7a0f2830-bf50-4195-9dad-d4d2c9529ee9","Type":"ContainerDied","Data":"ce15d4b03307ac2a9a7f3c1d05eba2559878bf699c0483537cc9d838c8197125"} Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.224189 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce15d4b03307ac2a9a7f3c1d05eba2559878bf699c0483537cc9d838c8197125" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.224240 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-txt96" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.424584 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fc994c476-fv9c9"] Mar 12 18:23:18 crc kubenswrapper[4926]: E0312 18:23:18.424996 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0f2830-bf50-4195-9dad-d4d2c9529ee9" containerName="placement-db-sync" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.425014 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0f2830-bf50-4195-9dad-d4d2c9529ee9" containerName="placement-db-sync" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.425263 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0f2830-bf50-4195-9dad-d4d2c9529ee9" containerName="placement-db-sync" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.426687 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.431549 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.431757 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.431895 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.432039 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.432298 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9lh8l" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.458210 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc994c476-fv9c9"] Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502576 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502627 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502685 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502732 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502753 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502846 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2nd\" (UniqueName: \"kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.502982 4926 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.608967 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609034 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609073 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2nd\" (UniqueName: \"kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609130 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609220 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609240 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.609292 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.610757 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.611586 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " 
pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.611914 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.619113 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.619645 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.621382 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.627804 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.635516 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2nd\" (UniqueName: \"kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.639683 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs\") pod \"placement-6fc994c476-fv9c9\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") " pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.662903 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.662941 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.678604 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.741004 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.745659 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.748810 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 12 18:23:18 crc kubenswrapper[4926]: I0312 18:23:18.764823 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:19 crc kubenswrapper[4926]: I0312 18:23:19.240301 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 18:23:19 crc kubenswrapper[4926]: I0312 18:23:19.240364 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:19 crc kubenswrapper[4926]: I0312 18:23:19.240376 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:19 crc kubenswrapper[4926]: I0312 18:23:19.240385 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 18:23:22 crc kubenswrapper[4926]: I0312 18:23:22.046011 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 18:23:22 crc kubenswrapper[4926]: I0312 18:23:22.243556 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:22 crc kubenswrapper[4926]: I0312 18:23:22.423172 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"] Mar 12 18:23:22 crc kubenswrapper[4926]: I0312 18:23:22.423391 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="dnsmasq-dns" containerID="cri-o://acee92af6092ee87ba774fbb0a9ef600de19a9fc8c8b93c392fa1761ce8a5481" gracePeriod=10 Mar 12 18:23:22 crc kubenswrapper[4926]: I0312 18:23:22.586217 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.058485 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.185225 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.217676 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-89554fb64-s9c6q" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.217727 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-89554fb64-s9c6q" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.295634 4926 generic.go:334] "Generic (PLEG): container finished" podID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerID="acee92af6092ee87ba774fbb0a9ef600de19a9fc8c8b93c392fa1761ce8a5481" exitCode=0 Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.296418 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" event={"ID":"af5b15ef-fa73-4ef4-9235-41da81503d2c","Type":"ContainerDied","Data":"acee92af6092ee87ba774fbb0a9ef600de19a9fc8c8b93c392fa1761ce8a5481"} Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.317917 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5c6848d8cd-cq57n" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.319556 4926 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c6848d8cd-cq57n" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.959953 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-98lfj" Mar 12 18:23:23 crc kubenswrapper[4926]: I0312 18:23:23.971579 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sgnbr" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.042698 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle\") pod \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.042752 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.042806 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data\") pod \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.042849 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.042866 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.043047 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.043074 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.043098 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwc8\" (UniqueName: \"kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8\") pod \"af5704dd-cd13-4e5f-a77b-01266c63eeba\" (UID: \"af5704dd-cd13-4e5f-a77b-01266c63eeba\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.043117 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mklss\" (UniqueName: 
\"kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss\") pod \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\" (UID: \"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6\") " Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.048206 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts" (OuterVolumeSpecName: "scripts") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.061049 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.076598 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss" (OuterVolumeSpecName: "kube-api-access-mklss") pod "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" (UID: "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6"). InnerVolumeSpecName "kube-api-access-mklss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.081860 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" (UID: "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.100635 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.101118 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8" (OuterVolumeSpecName: "kube-api-access-zbwc8") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "kube-api-access-zbwc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.101586 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" (UID: "e96fcb3d-2f9f-468d-bafa-060a9d1f1af6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.107059 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.133221 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data" (OuterVolumeSpecName: "config-data") pod "af5704dd-cd13-4e5f-a77b-01266c63eeba" (UID: "af5704dd-cd13-4e5f-a77b-01266c63eeba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.144963 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145037 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145569 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145644 4926 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145703 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145756 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145810 4926 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145861 4926 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af5704dd-cd13-4e5f-a77b-01266c63eeba-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145911 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwc8\" (UniqueName: \"kubernetes.io/projected/af5704dd-cd13-4e5f-a77b-01266c63eeba-kube-api-access-zbwc8\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.145969 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mklss\" (UniqueName: \"kubernetes.io/projected/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6-kube-api-access-mklss\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:24 
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246634 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246720 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246751 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246792 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246832 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzlwc\" (UniqueName: \"kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.246867 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.260724 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc" (OuterVolumeSpecName: "kube-api-access-gzlwc") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "kube-api-access-gzlwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.292889 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.307128 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.307142 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.323170 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.326054 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-sgnbr" event={"ID":"e96fcb3d-2f9f-468d-bafa-060a9d1f1af6","Type":"ContainerDied","Data":"89660f246e3c30bdbd43247de70865e18df164fd93b36ad1830555d68b2f5032"}
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.326093 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89660f246e3c30bdbd43247de70865e18df164fd93b36ad1830555d68b2f5032"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.326146 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-sgnbr"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.333983 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerStarted","Data":"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b"}
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.336023 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-98lfj" event={"ID":"af5704dd-cd13-4e5f-a77b-01266c63eeba","Type":"ContainerDied","Data":"45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc"}
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.336057 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45ccbdc85c6be068623b090cab069fbadeba37022e035780f13ac08ff104b1bc"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.336115 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-98lfj"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348399 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config" (OuterVolumeSpecName: "config") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348526 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") pod \"af5b15ef-fa73-4ef4-9235-41da81503d2c\" (UID: \"af5b15ef-fa73-4ef4-9235-41da81503d2c\") "
Mar 12 18:23:24 crc kubenswrapper[4926]: W0312 18:23:24.348657 4926 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/af5b15ef-fa73-4ef4-9235-41da81503d2c/volumes/kubernetes.io~configmap/config
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348672 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config" (OuterVolumeSpecName: "config") pod "af5b15ef-fa73-4ef4-9235-41da81503d2c" (UID: "af5b15ef-fa73-4ef4-9235-41da81503d2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348907 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348921 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-7pqqh" event={"ID":"af5b15ef-fa73-4ef4-9235-41da81503d2c","Type":"ContainerDied","Data":"c327e0d0fbf885a0bb6f202d846326d2682c20d15307d1ed071e6aa3068c47c0"}
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348965 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348981 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348991 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.348999 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.349002 4926 scope.go:117] "RemoveContainer" containerID="acee92af6092ee87ba774fbb0a9ef600de19a9fc8c8b93c392fa1761ce8a5481"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.349009 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/af5b15ef-fa73-4ef4-9235-41da81503d2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.349020 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzlwc\" (UniqueName: \"kubernetes.io/projected/af5b15ef-fa73-4ef4-9235-41da81503d2c-kube-api-access-gzlwc\") on node \"crc\" DevicePath \"\""
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.404998 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"]
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.407314 4926 scope.go:117] "RemoveContainer" containerID="d817244fbaa6cac20d7a51e9c98e06316cbeae770e3def99741dd49bb46b1970"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.419557 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-7pqqh"]
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.500357 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" path="/var/lib/kubelet/pods/af5b15ef-fa73-4ef4-9235-41da81503d2c/volumes"
Mar 12 18:23:24 crc kubenswrapper[4926]: I0312 18:23:24.515282 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fc994c476-fv9c9"]
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.085681 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b667c464b-fk8sc"]
Mar 12 18:23:25 crc kubenswrapper[4926]: E0312 18:23:25.086971 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" containerName="barbican-db-sync"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087015 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" containerName="barbican-db-sync"
Mar 12 18:23:25 crc kubenswrapper[4926]: E0312 18:23:25.087051 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="dnsmasq-dns"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087061 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="dnsmasq-dns"
Mar 12 18:23:25 crc kubenswrapper[4926]: E0312 18:23:25.087103 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="init"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087111 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="init"
Mar 12 18:23:25 crc kubenswrapper[4926]: E0312 18:23:25.087125 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5704dd-cd13-4e5f-a77b-01266c63eeba" containerName="keystone-bootstrap"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087133 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5704dd-cd13-4e5f-a77b-01266c63eeba" containerName="keystone-bootstrap"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087357 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5b15ef-fa73-4ef4-9235-41da81503d2c" containerName="dnsmasq-dns"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087384 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5704dd-cd13-4e5f-a77b-01266c63eeba" containerName="keystone-bootstrap"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.087401 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" containerName="barbican-db-sync"
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.088183 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b667c464b-fk8sc"
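The paired cpu_manager/state_mem/memory_manager entries fire on pod admission: before placing keystone-5b667c464b-fk8sc, the kubelet appears to sweep out CPU-set and memory-state records still held for containers of pods that no longer exist (the just-removed dnsmasq-dns, keystone-bootstrap, and barbican-db-sync). A toy Go version of that sweep, with the assignment values invented for illustration:

    package main

    import "fmt"

    func main() {
    	// containerName -> recorded cpuset (values are illustrative).
    	assignments := map[string]string{
    		"dnsmasq-dns":        "0-3",
    		"keystone-bootstrap": "0-3",
    		"barbican-db-sync":   "0-3",
    	}
    	live := map[string]bool{} // none of these pods still exists
    	for name := range assignments {
    		if !live[name] {
    			fmt.Println("RemoveStaleState: removing container", name)
    			delete(assignments, name) // safe during range in Go
    		}
    	}
    	fmt.Println("remaining assignments:", len(assignments))
    }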
Need to start a new one" pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.089927 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.092234 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.092237 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.092759 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vrsc7" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.093013 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.093098 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.105401 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b667c464b-fk8sc"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189403 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-scripts\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189488 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-combined-ca-bundle\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189509 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-internal-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189544 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-fernet-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189634 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-public-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189793 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgrq\" (UniqueName: \"kubernetes.io/projected/054d27f0-5c9b-4e59-98b3-e05609c3b257-kube-api-access-bdgrq\") pod \"keystone-5b667c464b-fk8sc\" (UID: 
\"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.189985 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-credential-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.190032 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-config-data\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.233725 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.235078 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.238204 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.238370 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hz28b" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.238600 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.293487 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-credential-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.293723 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-config-data\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.293834 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-scripts\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.294037 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-combined-ca-bundle\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.294104 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-internal-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" 
(UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.294201 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-fernet-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.294264 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-public-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.294364 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgrq\" (UniqueName: \"kubernetes.io/projected/054d27f0-5c9b-4e59-98b3-e05609c3b257-kube-api-access-bdgrq\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.306098 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-config-data\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.308914 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-public-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.309331 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-combined-ca-bundle\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.309353 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-credential-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.312048 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-fernet-keys\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.312849 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-internal-tls-certs\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.318878 
4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054d27f0-5c9b-4e59-98b3-e05609c3b257-scripts\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.352812 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.354197 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgrq\" (UniqueName: \"kubernetes.io/projected/054d27f0-5c9b-4e59-98b3-e05609c3b257-kube-api-access-bdgrq\") pod \"keystone-5b667c464b-fk8sc\" (UID: \"054d27f0-5c9b-4e59-98b3-e05609c3b257\") " pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.396404 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.396472 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6qc2\" (UniqueName: \"kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.396530 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.396583 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.396660 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.410345 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerStarted","Data":"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"} Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.410387 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerStarted","Data":"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"} 
Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.410396 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerStarted","Data":"1f1ee7c53345b2a1183a515645bed6bfd508f8efad3f3b4a2588d4c7787c2e09"} Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.411398 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.411673 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.422750 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.456247 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.475874 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.482369 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.491916 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.500590 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.500626 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6qc2\" (UniqueName: \"kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.500687 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.500755 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.500794 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 
18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.502730 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.504056 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.514863 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.515340 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.515676 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-fddddf9f9-kbftb"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.526664 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.528597 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.537734 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6qc2\" (UniqueName: \"kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2\") pod \"barbican-worker-78dfff8dc9-fgs8s\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.540630 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.556904 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.595033 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fddddf9f9-kbftb"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.604355 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.604421 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ktzb\" (UniqueName: \"kubernetes.io/projected/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-kube-api-access-8ktzb\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.605928 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.605967 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data-custom\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.605989 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606021 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606038 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7fh\" (UniqueName: \"kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606080 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxtxm\" (UniqueName: \"kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " 
pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606099 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-combined-ca-bundle\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606122 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606145 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606180 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606199 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606243 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606262 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.606307 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-logs\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.640537 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 
12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.679439 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fc994c476-fv9c9" podStartSLOduration=7.679414865 podStartE2EDuration="7.679414865s" podCreationTimestamp="2026-03-12 18:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:25.640875686 +0000 UTC m=+1246.009502019" watchObservedRunningTime="2026-03-12 18:23:25.679414865 +0000 UTC m=+1246.048041198" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.680504 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-74668b896d-mctls"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.681986 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709498 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ktzb\" (UniqueName: \"kubernetes.io/projected/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-kube-api-access-8ktzb\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709533 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709553 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data-custom\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709574 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709601 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709616 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7fh\" (UniqueName: \"kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709640 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxtxm\" (UniqueName: 
\"kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709658 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-combined-ca-bundle\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709677 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709700 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709728 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709749 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709776 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709794 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709830 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-logs\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.709855 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.710552 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.713090 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.713392 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.717144 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.717910 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.723874 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-logs\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.734623 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.739277 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.741494 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data\") pod \"barbican-keystone-listener-94956785d-mtl2w\" 
(UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.741558 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74668b896d-mctls"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.742853 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.743188 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-combined-ca-bundle\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.743321 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-config-data-custom\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.750661 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxtxm\" (UniqueName: \"kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm\") pod \"dnsmasq-dns-85ff748b95-gbmf8\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.756469 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.762129 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ktzb\" (UniqueName: \"kubernetes.io/projected/08cef0ec-16bc-4b64-95f6-e0d8f22fa00e-kube-api-access-8ktzb\") pod \"barbican-worker-fddddf9f9-kbftb\" (UID: \"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e\") " pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.778598 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7fh\" (UniqueName: \"kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh\") pod \"barbican-keystone-listener-94956785d-mtl2w\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.811999 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data-custom\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " 
pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.812090 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-logs\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.812110 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfzr5\" (UniqueName: \"kubernetes.io/projected/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-kube-api-access-wfzr5\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.812131 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-combined-ca-bundle\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.812253 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.831852 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.872966 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.874458 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.892280 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.892810 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.908179 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-fddddf9f9-kbftb" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.914431 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-logs\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.914498 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfzr5\" (UniqueName: \"kubernetes.io/projected/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-kube-api-access-wfzr5\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.914524 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-combined-ca-bundle\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.914744 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.914843 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data-custom\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.915351 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-logs\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.916932 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.931567 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.954413 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-combined-ca-bundle\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 
crc kubenswrapper[4926]: I0312 18:23:25.982737 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfzr5\" (UniqueName: \"kubernetes.io/projected/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-kube-api-access-wfzr5\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:25 crc kubenswrapper[4926]: I0312 18:23:25.984957 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92-config-data-custom\") pod \"barbican-keystone-listener-74668b896d-mctls\" (UID: \"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92\") " pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.016748 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldmj\" (UniqueName: \"kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.016831 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.016881 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.017038 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.017064 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.019068 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-74668b896d-mctls" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.122573 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.122651 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.122688 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldmj\" (UniqueName: \"kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.122728 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.122764 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.126995 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.138228 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.145968 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.149071 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldmj\" (UniqueName: \"kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 
crc kubenswrapper[4926]: I0312 18:23:26.149553 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom\") pod \"barbican-api-9b55c586b-s7wqs\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.224456 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.475927 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgvzs" event={"ID":"dac4b5d6-fb31-4955-8679-db9d3ff63c10","Type":"ContainerStarted","Data":"96adbf85f267b1793c500e8c152eb5c14d98f7080edf6f6f116872d3c431cb0a"} Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.501929 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lgvzs" podStartSLOduration=3.318637497 podStartE2EDuration="42.501911225s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="2026-03-12 18:22:45.744242434 +0000 UTC m=+1206.112868767" lastFinishedPulling="2026-03-12 18:23:24.927516162 +0000 UTC m=+1245.296142495" observedRunningTime="2026-03-12 18:23:26.496739203 +0000 UTC m=+1246.865365536" watchObservedRunningTime="2026-03-12 18:23:26.501911225 +0000 UTC m=+1246.870537558" Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.582195 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b667c464b-fk8sc"] Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.600199 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:26 crc kubenswrapper[4926]: W0312 18:23:26.621093 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054d27f0_5c9b_4e59_98b3_e05609c3b257.slice/crio-f3c3a12240b2c31f1ffa837e04ed066879a72f603a656ed2b36d6c9e4a650f2f WatchSource:0}: Error finding container f3c3a12240b2c31f1ffa837e04ed066879a72f603a656ed2b36d6c9e4a650f2f: Status 404 returned error can't find the container with id f3c3a12240b2c31f1ffa837e04ed066879a72f603a656ed2b36d6c9e4a650f2f Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.736911 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:23:26 crc kubenswrapper[4926]: W0312 18:23:26.744761 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda936793_13b1_4815_a1ec_4d5d609ca5e3.slice/crio-01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd WatchSource:0}: Error finding container 01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd: Status 404 returned error can't find the container with id 01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.921781 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 12 18:23:26 crc kubenswrapper[4926]: I0312 18:23:26.935134 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74668b896d-mctls"] Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.099260 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:27 crc kubenswrapper[4926]: W0312 18:23:27.118364 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1f2a42_878e_46c0_bd66_4927a4689299.slice/crio-28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc WatchSource:0}: Error finding container 28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc: Status 404 returned error can't find the container with id 28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.123723 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-fddddf9f9-kbftb"] Mar 12 18:23:27 crc kubenswrapper[4926]: W0312 18:23:27.130096 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08cef0ec_16bc_4b64_95f6_e0d8f22fa00e.slice/crio-c8e2f3a7bbe0bf79a104ee85f894f11f7f0fba578b28b0c57b6258b01319863e WatchSource:0}: Error finding container c8e2f3a7bbe0bf79a104ee85f894f11f7f0fba578b28b0c57b6258b01319863e: Status 404 returned error can't find the container with id c8e2f3a7bbe0bf79a104ee85f894f11f7f0fba578b28b0c57b6258b01319863e Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.487403 4926 generic.go:334] "Generic (PLEG): container finished" podID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerID="293a2572fcb8b2d40b187537e1c594705c232add90d5f330faf45d6e18e49dd0" exitCode=0 Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.487493 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" event={"ID":"08abfe56-0e5c-4634-9a1a-488e2bbb587d","Type":"ContainerDied","Data":"293a2572fcb8b2d40b187537e1c594705c232add90d5f330faf45d6e18e49dd0"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.487912 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" event={"ID":"08abfe56-0e5c-4634-9a1a-488e2bbb587d","Type":"ContainerStarted","Data":"07e4278fe40ae8695fb94d6a2d5bec78c86b69a3255ccbd19aa22a7865e54876"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.493962 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerStarted","Data":"3adbb909a2691a9c3475b5b8bcf70c23ddd0ca36e2352ef1830b72181f72ded0"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.496361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b667c464b-fk8sc" event={"ID":"054d27f0-5c9b-4e59-98b3-e05609c3b257","Type":"ContainerStarted","Data":"272c18702618eb570c10d215f82e06e4aaeb5cf9a9534f622edc7dd3c1a2f82b"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.496398 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b667c464b-fk8sc" event={"ID":"054d27f0-5c9b-4e59-98b3-e05609c3b257","Type":"ContainerStarted","Data":"f3c3a12240b2c31f1ffa837e04ed066879a72f603a656ed2b36d6c9e4a650f2f"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.497410 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.509276 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74668b896d-mctls" 
event={"ID":"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92","Type":"ContainerStarted","Data":"7ab0fc4ee2052252ea1c5a664f1849480ecb5c7e93f1eac70fd366eabd6157fb"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.511048 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fddddf9f9-kbftb" event={"ID":"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e","Type":"ContainerStarted","Data":"c8e2f3a7bbe0bf79a104ee85f894f11f7f0fba578b28b0c57b6258b01319863e"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.514275 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerStarted","Data":"01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.519157 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerStarted","Data":"0ae594d417f33a2af41173b2f1260c8120ad7232590b0279e7d83cb0dbe67a9c"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.519198 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerStarted","Data":"28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc"} Mar 12 18:23:27 crc kubenswrapper[4926]: I0312 18:23:27.532790 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b667c464b-fk8sc" podStartSLOduration=2.532775262 podStartE2EDuration="2.532775262s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:27.5260103 +0000 UTC m=+1247.894636663" watchObservedRunningTime="2026-03-12 18:23:27.532775262 +0000 UTC m=+1247.901401595" Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.529513 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" event={"ID":"08abfe56-0e5c-4634-9a1a-488e2bbb587d","Type":"ContainerStarted","Data":"3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b"} Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.530982 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.533739 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerStarted","Data":"8231f1ef7dfab85032bd8e8cc8a2434c72aeff5f8977a2307df4d52b313ce335"} Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.533777 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.533801 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.558189 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" podStartSLOduration=3.558164808 podStartE2EDuration="3.558164808s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-12 18:23:28.552063996 +0000 UTC m=+1248.920690349" watchObservedRunningTime="2026-03-12 18:23:28.558164808 +0000 UTC m=+1248.926791141" Mar 12 18:23:28 crc kubenswrapper[4926]: I0312 18:23:28.586692 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9b55c586b-s7wqs" podStartSLOduration=3.586674232 podStartE2EDuration="3.586674232s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:28.576955117 +0000 UTC m=+1248.945581460" watchObservedRunningTime="2026-03-12 18:23:28.586674232 +0000 UTC m=+1248.955300565" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.297702 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fd6447dc6-7dg85"] Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.299094 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.304604 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.305568 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.332607 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd6447dc6-7dg85"] Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.402791 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403346 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-combined-ca-bundle\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403541 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-internal-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403624 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-public-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403685 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data-custom\") pod \"barbican-api-6fd6447dc6-7dg85\" 
(UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403745 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc7tt\" (UniqueName: \"kubernetes.io/projected/32458d94-2727-4718-a842-f20e68b6a0dd-kube-api-access-kc7tt\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.403821 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32458d94-2727-4718-a842-f20e68b6a0dd-logs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.505751 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-public-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.505837 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data-custom\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.505887 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc7tt\" (UniqueName: \"kubernetes.io/projected/32458d94-2727-4718-a842-f20e68b6a0dd-kube-api-access-kc7tt\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.505933 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32458d94-2727-4718-a842-f20e68b6a0dd-logs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.505988 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.506031 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-combined-ca-bundle\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.506085 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-internal-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" 
(UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.507035 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32458d94-2727-4718-a842-f20e68b6a0dd-logs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.514247 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-combined-ca-bundle\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.529941 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-public-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.531494 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-internal-tls-certs\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.533295 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.535722 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32458d94-2727-4718-a842-f20e68b6a0dd-config-data-custom\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.544482 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc7tt\" (UniqueName: \"kubernetes.io/projected/32458d94-2727-4718-a842-f20e68b6a0dd-kube-api-access-kc7tt\") pod \"barbican-api-6fd6447dc6-7dg85\" (UID: \"32458d94-2727-4718-a842-f20e68b6a0dd\") " pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:29 crc kubenswrapper[4926]: I0312 18:23:29.631663 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.450421 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fd6447dc6-7dg85"] Mar 12 18:23:30 crc kubenswrapper[4926]: W0312 18:23:30.466797 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32458d94_2727_4718_a842_f20e68b6a0dd.slice/crio-9c7e7e89198f2cf9ac44a3023e839fc1bfe908ca6727788ccbb5e057b4856f40 WatchSource:0}: Error finding container 9c7e7e89198f2cf9ac44a3023e839fc1bfe908ca6727788ccbb5e057b4856f40: Status 404 returned error can't find the container with id 9c7e7e89198f2cf9ac44a3023e839fc1bfe908ca6727788ccbb5e057b4856f40 Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.568647 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerStarted","Data":"300d112166ede447ec6708957fbc43526b8d0fa27de369019ba8b434782269d0"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.568700 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerStarted","Data":"bee09196dcaf75cb5eadfa250f0ca7febdf9ab076b98f039b2a754122b989aef"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.582147 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74668b896d-mctls" event={"ID":"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92","Type":"ContainerStarted","Data":"0c2e139904b5d1ac08e504463fb8ce4735df2887bdf04c680ede7cd5ab04f211"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.585194 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fddddf9f9-kbftb" event={"ID":"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e","Type":"ContainerStarted","Data":"d5a36d4293aef7b5d2a9d41d34927260eb58046868118cf247fcac4acce3f566"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.586196 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd6447dc6-7dg85" event={"ID":"32458d94-2727-4718-a842-f20e68b6a0dd","Type":"ContainerStarted","Data":"9c7e7e89198f2cf9ac44a3023e839fc1bfe908ca6727788ccbb5e057b4856f40"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.590069 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerStarted","Data":"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2"} Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.631315 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-fddddf9f9-kbftb" podStartSLOduration=2.816225767 podStartE2EDuration="5.63129129s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="2026-03-12 18:23:27.140723961 +0000 UTC m=+1247.509350294" lastFinishedPulling="2026-03-12 18:23:29.955789484 +0000 UTC m=+1250.324415817" observedRunningTime="2026-03-12 18:23:30.611283002 +0000 UTC m=+1250.979909335" watchObservedRunningTime="2026-03-12 18:23:30.63129129 +0000 UTC m=+1250.999917623" Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.678167 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" podStartSLOduration=2.38227452 
podStartE2EDuration="5.67814163s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="2026-03-12 18:23:26.661298566 +0000 UTC m=+1247.029924899" lastFinishedPulling="2026-03-12 18:23:29.957165676 +0000 UTC m=+1250.325792009" observedRunningTime="2026-03-12 18:23:30.637233397 +0000 UTC m=+1251.005859730" watchObservedRunningTime="2026-03-12 18:23:30.67814163 +0000 UTC m=+1251.046767963" Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.737066 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.742710 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-74668b896d-mctls" podStartSLOduration=2.72742515 podStartE2EDuration="5.742692056s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="2026-03-12 18:23:26.938574827 +0000 UTC m=+1247.307201160" lastFinishedPulling="2026-03-12 18:23:29.953841733 +0000 UTC m=+1250.322468066" observedRunningTime="2026-03-12 18:23:30.677206981 +0000 UTC m=+1251.045833314" watchObservedRunningTime="2026-03-12 18:23:30.742692056 +0000 UTC m=+1251.111318389" Mar 12 18:23:30 crc kubenswrapper[4926]: I0312 18:23:30.756279 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.604614 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerStarted","Data":"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8"} Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.604696 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener-log" containerID="cri-o://38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2" gracePeriod=30 Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.604736 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener" containerID="cri-o://7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8" gracePeriod=30 Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.608998 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74668b896d-mctls" event={"ID":"4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92","Type":"ContainerStarted","Data":"c8ec8c9afa0fc5bd02807c600c4d8dde24f99e3b5f25ba0d76685c20d0c9ee08"} Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.616268 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-fddddf9f9-kbftb" event={"ID":"08cef0ec-16bc-4b64-95f6-e0d8f22fa00e","Type":"ContainerStarted","Data":"6993d35552e1e9c5edf87ca66043b892ec364d28d08d5907d5afa076e67ba45a"} Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.618480 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd6447dc6-7dg85" event={"ID":"32458d94-2727-4718-a842-f20e68b6a0dd","Type":"ContainerStarted","Data":"ee2e4bc6fdc6d3d511e5d9d18c4d7f78ce09e67e478b5dd24044bb375fb67600"} Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.618523 4926 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fd6447dc6-7dg85" event={"ID":"32458d94-2727-4718-a842-f20e68b6a0dd","Type":"ContainerStarted","Data":"cedc169daa886f2a21e9492e6b54650af1b8208615b2c2bfa967597a97d9c542"} Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.646222 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" podStartSLOduration=3.441357733 podStartE2EDuration="6.646200567s" podCreationTimestamp="2026-03-12 18:23:25 +0000 UTC" firstStartedPulling="2026-03-12 18:23:26.751125005 +0000 UTC m=+1247.119751338" lastFinishedPulling="2026-03-12 18:23:29.955967839 +0000 UTC m=+1250.324594172" observedRunningTime="2026-03-12 18:23:31.621890404 +0000 UTC m=+1251.990516737" watchObservedRunningTime="2026-03-12 18:23:31.646200567 +0000 UTC m=+1252.014826900" Mar 12 18:23:31 crc kubenswrapper[4926]: I0312 18:23:31.650600 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fd6447dc6-7dg85" podStartSLOduration=2.650582665 podStartE2EDuration="2.650582665s" podCreationTimestamp="2026-03-12 18:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:31.645486204 +0000 UTC m=+1252.014112537" watchObservedRunningTime="2026-03-12 18:23:31.650582665 +0000 UTC m=+1252.019208988" Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.636521 4926 generic.go:334] "Generic (PLEG): container finished" podID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerID="38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2" exitCode=143 Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.636572 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerDied","Data":"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2"} Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.638389 4926 generic.go:334] "Generic (PLEG): container finished" podID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" containerID="96adbf85f267b1793c500e8c152eb5c14d98f7080edf6f6f116872d3c431cb0a" exitCode=0 Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.638568 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgvzs" event={"ID":"dac4b5d6-fb31-4955-8679-db9d3ff63c10","Type":"ContainerDied","Data":"96adbf85f267b1793c500e8c152eb5c14d98f7080edf6f6f116872d3c431cb0a"} Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.638978 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.639016 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.639416 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker-log" containerID="cri-o://bee09196dcaf75cb5eadfa250f0ca7febdf9ab076b98f039b2a754122b989aef" gracePeriod=30 Mar 12 18:23:32 crc kubenswrapper[4926]: I0312 18:23:32.639523 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" 
containerName="barbican-worker" containerID="cri-o://300d112166ede447ec6708957fbc43526b8d0fa27de369019ba8b434782269d0" gracePeriod=30 Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.320096 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c6848d8cd-cq57n" podUID="a1ae8f23-3518-430a-bbcf-e7be0cb8282e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.633813 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.656871 4926 generic.go:334] "Generic (PLEG): container finished" podID="81afb8cd-af2c-4515-b4d1-893903371af0" containerID="300d112166ede447ec6708957fbc43526b8d0fa27de369019ba8b434782269d0" exitCode=0 Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.656911 4926 generic.go:334] "Generic (PLEG): container finished" podID="81afb8cd-af2c-4515-b4d1-893903371af0" containerID="bee09196dcaf75cb5eadfa250f0ca7febdf9ab076b98f039b2a754122b989aef" exitCode=143 Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.657052 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerDied","Data":"300d112166ede447ec6708957fbc43526b8d0fa27de369019ba8b434782269d0"} Mar 12 18:23:33 crc kubenswrapper[4926]: I0312 18:23:33.657185 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerDied","Data":"bee09196dcaf75cb5eadfa250f0ca7febdf9ab076b98f039b2a754122b989aef"} Mar 12 18:23:35 crc kubenswrapper[4926]: I0312 18:23:35.894676 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:35 crc kubenswrapper[4926]: I0312 18:23:35.977098 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:35 crc kubenswrapper[4926]: I0312 18:23:35.977348 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="dnsmasq-dns" containerID="cri-o://3d75e203f5f24817d9aa7e9a6cf9f743a4b6a96bd884d103fc15a23a410323b0" gracePeriod=10 Mar 12 18:23:36 crc kubenswrapper[4926]: I0312 18:23:36.688754 4926 generic.go:334] "Generic (PLEG): container finished" podID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerID="3d75e203f5f24817d9aa7e9a6cf9f743a4b6a96bd884d103fc15a23a410323b0" exitCode=0 Mar 12 18:23:36 crc kubenswrapper[4926]: I0312 18:23:36.688875 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" event={"ID":"b5cd3f65-2af4-4a38-ab87-c266452f8c5a","Type":"ContainerDied","Data":"3d75e203f5f24817d9aa7e9a6cf9f743a4b6a96bd884d103fc15a23a410323b0"} Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.240041 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.699480 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lgvzs" event={"ID":"dac4b5d6-fb31-4955-8679-db9d3ff63c10","Type":"ContainerDied","Data":"1b929dd828244692447588d25c4d9ac5aab4e983cee4fcf89d743277f69d1b5d"} Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.699727 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b929dd828244692447588d25c4d9ac5aab4e983cee4fcf89d743277f69d1b5d" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.716799 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.775039 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.863862 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.909949 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.910060 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.910100 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.910139 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgqfm\" (UniqueName: \"kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.910249 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.910347 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data\") pod \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\" (UID: \"dac4b5d6-fb31-4955-8679-db9d3ff63c10\") " Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.913768 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id" (OuterVolumeSpecName: 
"etc-machine-id") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.933104 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm" (OuterVolumeSpecName: "kube-api-access-fgqfm") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "kube-api-access-fgqfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.933472 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts" (OuterVolumeSpecName: "scripts") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.933526 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:37 crc kubenswrapper[4926]: I0312 18:23:37.960042 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:37.999208 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data" (OuterVolumeSpecName: "config-data") pod "dac4b5d6-fb31-4955-8679-db9d3ff63c10" (UID: "dac4b5d6-fb31-4955-8679-db9d3ff63c10"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013207 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013240 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013249 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013260 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgqfm\" (UniqueName: \"kubernetes.io/projected/dac4b5d6-fb31-4955-8679-db9d3ff63c10-kube-api-access-fgqfm\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013270 4926 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dac4b5d6-fb31-4955-8679-db9d3ff63c10-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.013279 4926 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dac4b5d6-fb31-4955-8679-db9d3ff63c10-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:38 crc kubenswrapper[4926]: E0312 18:23:38.612509 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddac4b5d6_fb31_4955_8679_db9d3ff63c10.slice/crio-1b929dd828244692447588d25c4d9ac5aab4e983cee4fcf89d743277f69d1b5d\": RecentStats: unable to find data in memory cache]" Mar 12 18:23:38 crc kubenswrapper[4926]: I0312 18:23:38.705908 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lgvzs" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.091687 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:39 crc kubenswrapper[4926]: E0312 18:23:39.092614 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" containerName="cinder-db-sync" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.092627 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" containerName="cinder-db-sync" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.092828 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" containerName="cinder-db-sync" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.093690 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.098820 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.099086 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m6xv4" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.099202 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.099703 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.165710 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.168929 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.182854 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:23:39 crc kubenswrapper[4926]: E0312 18:23:39.183259 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker-log" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.183271 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker-log" Mar 12 18:23:39 crc kubenswrapper[4926]: E0312 18:23:39.183313 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.183321 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.183495 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.183510 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" containerName="barbican-worker-log" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.184394 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.209754 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.214365 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.241429 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6qc2\" (UniqueName: \"kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2\") pod \"81afb8cd-af2c-4515-b4d1-893903371af0\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.241592 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs\") pod \"81afb8cd-af2c-4515-b4d1-893903371af0\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.241643 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom\") pod \"81afb8cd-af2c-4515-b4d1-893903371af0\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.241768 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle\") pod \"81afb8cd-af2c-4515-b4d1-893903371af0\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.241806 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data\") pod \"81afb8cd-af2c-4515-b4d1-893903371af0\" (UID: \"81afb8cd-af2c-4515-b4d1-893903371af0\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242103 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242128 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242198 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242222 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242251 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jpn6w\" (UniqueName: \"kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242279 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242344 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8xc\" (UniqueName: \"kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242388 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242421 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242500 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242582 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.242605 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.243991 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs" (OuterVolumeSpecName: "logs") pod "81afb8cd-af2c-4515-b4d1-893903371af0" (UID: "81afb8cd-af2c-4515-b4d1-893903371af0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.261583 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81afb8cd-af2c-4515-b4d1-893903371af0" (UID: "81afb8cd-af2c-4515-b4d1-893903371af0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.263845 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2" (OuterVolumeSpecName: "kube-api-access-v6qc2") pod "81afb8cd-af2c-4515-b4d1-893903371af0" (UID: "81afb8cd-af2c-4515-b4d1-893903371af0"). InnerVolumeSpecName "kube-api-access-v6qc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.318144 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:39 crc kubenswrapper[4926]: E0312 18:23:39.318562 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="dnsmasq-dns" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.318594 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="dnsmasq-dns" Mar 12 18:23:39 crc kubenswrapper[4926]: E0312 18:23:39.318619 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="init" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.318625 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="init" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.318788 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" containerName="dnsmasq-dns" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.319774 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.322311 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.333262 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81afb8cd-af2c-4515-b4d1-893903371af0" (UID: "81afb8cd-af2c-4515-b4d1-893903371af0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346552 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346735 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwp6h\" (UniqueName: \"kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346857 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346899 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346920 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.346945 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347130 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347181 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8xc\" (UniqueName: \"kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347217 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347242 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347272 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347333 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347351 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347401 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347418 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347481 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347509 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347530 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpn6w\" (UniqueName: \"kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347575 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347585 4926 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6qc2\" (UniqueName: \"kubernetes.io/projected/81afb8cd-af2c-4515-b4d1-893903371af0-kube-api-access-v6qc2\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347596 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81afb8cd-af2c-4515-b4d1-893903371af0-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.347604 4926 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.350127 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.350961 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.351966 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.352017 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.353044 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.357401 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h" (OuterVolumeSpecName: "kube-api-access-rwp6h") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "kube-api-access-rwp6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.367785 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.374179 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.374491 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.376121 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.377810 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.380852 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.392999 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8xc\" (UniqueName: \"kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc\") pod \"dnsmasq-dns-5c9776ccc5-rln7t\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.404623 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpn6w\" (UniqueName: \"kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w\") pod \"cinder-scheduler-0\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.441367 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data" (OuterVolumeSpecName: "config-data") pod "81afb8cd-af2c-4515-b4d1-893903371af0" (UID: "81afb8cd-af2c-4515-b4d1-893903371af0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.448008 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.448022 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.448769 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") pod \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\" (UID: \"b5cd3f65-2af4-4a38-ab87-c266452f8c5a\") " Mar 12 18:23:39 crc kubenswrapper[4926]: W0312 18:23:39.448922 4926 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b5cd3f65-2af4-4a38-ab87-c266452f8c5a/volumes/kubernetes.io~configmap/dns-swift-storage-0 Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.448955 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449071 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449103 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449120 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449161 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg65z\" (UniqueName: \"kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449177 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 
18:23:39.449227 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.449298 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.450025 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.450057 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.450072 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81afb8cd-af2c-4515-b4d1-893903371af0-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.450084 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwp6h\" (UniqueName: \"kubernetes.io/projected/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-kube-api-access-rwp6h\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.453172 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config" (OuterVolumeSpecName: "config") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.464318 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.469884 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5cd3f65-2af4-4a38-ab87-c266452f8c5a" (UID: "b5cd3f65-2af4-4a38-ab87-c266452f8c5a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.551951 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552002 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552021 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552065 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg65z\" (UniqueName: \"kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552084 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552123 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552157 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552329 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552342 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552351 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5cd3f65-2af4-4a38-ab87-c266452f8c5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552523 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.552980 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.556328 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.556851 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.561931 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.563190 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.567954 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.571843 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg65z\" (UniqueName: \"kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z\") pod \"cinder-api-0\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.621776 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.663855 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.771625 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerStarted","Data":"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef"} Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.772073 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-central-agent" containerID="cri-o://3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c" gracePeriod=30 Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.772621 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.772885 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="proxy-httpd" containerID="cri-o://d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef" gracePeriod=30 Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.772931 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="sg-core" containerID="cri-o://a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b" gracePeriod=30 Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.772969 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-notification-agent" containerID="cri-o://aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84" gracePeriod=30 Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.814392 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" event={"ID":"b5cd3f65-2af4-4a38-ab87-c266452f8c5a","Type":"ContainerDied","Data":"a1cf427ff4b305faffde9b50b631a3a735bc0b296fb816c491a872c6626c0d60"} Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.814461 4926 scope.go:117] "RemoveContainer" containerID="3d75e203f5f24817d9aa7e9a6cf9f743a4b6a96bd884d103fc15a23a410323b0" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.814739 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-rvkqm" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.841032 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" event={"ID":"81afb8cd-af2c-4515-b4d1-893903371af0","Type":"ContainerDied","Data":"3adbb909a2691a9c3475b5b8bcf70c23ddd0ca36e2352ef1830b72181f72ded0"} Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.841148 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-78dfff8dc9-fgs8s" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.864074 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.725003928 podStartE2EDuration="55.864050804s" podCreationTimestamp="2026-03-12 18:22:44 +0000 UTC" firstStartedPulling="2026-03-12 18:22:45.823072656 +0000 UTC m=+1206.191698979" lastFinishedPulling="2026-03-12 18:23:38.962119522 +0000 UTC m=+1259.330745855" observedRunningTime="2026-03-12 18:23:39.815720347 +0000 UTC m=+1260.184346690" watchObservedRunningTime="2026-03-12 18:23:39.864050804 +0000 UTC m=+1260.232677137" Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.919502 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.976067 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-rvkqm"] Mar 12 18:23:39 crc kubenswrapper[4926]: I0312 18:23:39.979679 4926 scope.go:117] "RemoveContainer" containerID="2f62316dc5afc9b2e89e828a9e035d936c5f1005f1c85a5b2bad886d39f3e886" Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.013730 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.039249 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-78dfff8dc9-fgs8s"] Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.079734 4926 scope.go:117] "RemoveContainer" containerID="300d112166ede447ec6708957fbc43526b8d0fa27de369019ba8b434782269d0" Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.133851 4926 scope.go:117] "RemoveContainer" containerID="bee09196dcaf75cb5eadfa250f0ca7febdf9ab076b98f039b2a754122b989aef" Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.144471 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.384315 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:23:40 crc kubenswrapper[4926]: W0312 18:23:40.385384 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ab53a0_e811_4b31_8a1e_d71be4115b2c.slice/crio-2c7e6fa4e701c31af5cac5d50047d5137f377c93ee8e669dc7d858e4f08b2574 WatchSource:0}: Error finding container 2c7e6fa4e701c31af5cac5d50047d5137f377c93ee8e669dc7d858e4f08b2574: Status 404 returned error can't find the container with id 2c7e6fa4e701c31af5cac5d50047d5137f377c93ee8e669dc7d858e4f08b2574 Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.434571 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:40 crc kubenswrapper[4926]: W0312 18:23:40.447862 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cc2e75_39e6_4148_87a5_022cc3690da8.slice/crio-d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4 WatchSource:0}: Error finding container d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4: Status 404 returned error can't find the container with id d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4 Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.507329 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="81afb8cd-af2c-4515-b4d1-893903371af0" path="/var/lib/kubelet/pods/81afb8cd-af2c-4515-b4d1-893903371af0/volumes" Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.508193 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5cd3f65-2af4-4a38-ab87-c266452f8c5a" path="/var/lib/kubelet/pods/b5cd3f65-2af4-4a38-ab87-c266452f8c5a/volumes" Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.885121 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerStarted","Data":"d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.887207 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerStarted","Data":"ba45c947e1da77d586b46aef916a37581f794d00be599bae5b1f6a31185bfcd3"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.887237 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerStarted","Data":"2c7e6fa4e701c31af5cac5d50047d5137f377c93ee8e669dc7d858e4f08b2574"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901350 4926 generic.go:334] "Generic (PLEG): container finished" podID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerID="d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef" exitCode=0 Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901383 4926 generic.go:334] "Generic (PLEG): container finished" podID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerID="a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b" exitCode=2 Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901391 4926 generic.go:334] "Generic (PLEG): container finished" podID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerID="3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c" exitCode=0 Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901429 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerDied","Data":"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901467 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerDied","Data":"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.901477 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerDied","Data":"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c"} Mar 12 18:23:40 crc kubenswrapper[4926]: I0312 18:23:40.919266 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerStarted","Data":"b5b79d1f13639d906d19091084d8e1ea42e28328a4dc8a574f851692ee3f9f8c"} Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.241967 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.408349 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.584817 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fd6447dc6-7dg85" Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.642147 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.642354 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9b55c586b-s7wqs" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api-log" containerID="cri-o://0ae594d417f33a2af41173b2f1260c8120ad7232590b0279e7d83cb0dbe67a9c" gracePeriod=30 Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.642591 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9b55c586b-s7wqs" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api" containerID="cri-o://8231f1ef7dfab85032bd8e8cc8a2434c72aeff5f8977a2307df4d52b313ce335" gracePeriod=30 Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.958273 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerStarted","Data":"5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc"} Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.983951 4926 generic.go:334] "Generic (PLEG): container finished" podID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerID="0ae594d417f33a2af41173b2f1260c8120ad7232590b0279e7d83cb0dbe67a9c" exitCode=143 Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.984053 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerDied","Data":"0ae594d417f33a2af41173b2f1260c8120ad7232590b0279e7d83cb0dbe67a9c"} Mar 12 18:23:41 crc kubenswrapper[4926]: I0312 18:23:41.992131 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerStarted","Data":"48ea775ffdf8bca7487b51cdb7c8d987f732c9702e1c95ad138a8eebbbab7c90"} Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.000964 4926 generic.go:334] "Generic (PLEG): container finished" podID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerID="ba45c947e1da77d586b46aef916a37581f794d00be599bae5b1f6a31185bfcd3" exitCode=0 Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.002642 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerDied","Data":"ba45c947e1da77d586b46aef916a37581f794d00be599bae5b1f6a31185bfcd3"} Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.449550 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-884f7b65b-tpkzl" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.773154 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.773377 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-65c5c86775-mct68" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-api" containerID="cri-o://94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e" 
gracePeriod=30 Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.773978 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-65c5c86775-mct68" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" containerID="cri-o://d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217" gracePeriod=30 Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.789479 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7fc6496dbc-7qwrw"] Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.791015 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.808082 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fc6496dbc-7qwrw"] Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854327 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-httpd-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854422 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-ovndb-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854465 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-combined-ca-bundle\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854489 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854524 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-internal-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854557 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kxfv\" (UniqueName: \"kubernetes.io/projected/bf1502ef-50a4-45e0-b193-a6e25abccb32-kube-api-access-4kxfv\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.854610 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-public-tls-certs\") pod 
\"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.926211 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-65c5c86775-mct68" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:49608->10.217.0.159:9696: read: connection reset by peer" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.956341 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-public-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.956675 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-httpd-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.956822 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-ovndb-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.956955 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-combined-ca-bundle\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.957075 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.957692 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-internal-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.957868 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kxfv\" (UniqueName: \"kubernetes.io/projected/bf1502ef-50a4-45e0-b193-a6e25abccb32-kube-api-access-4kxfv\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.961317 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc 
kubenswrapper[4926]: I0312 18:23:42.963240 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-internal-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.963953 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-httpd-config\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.964802 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-ovndb-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.965054 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-public-tls-certs\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.978125 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1502ef-50a4-45e0-b193-a6e25abccb32-combined-ca-bundle\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:42 crc kubenswrapper[4926]: I0312 18:23:42.990370 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kxfv\" (UniqueName: \"kubernetes.io/projected/bf1502ef-50a4-45e0-b193-a6e25abccb32-kube-api-access-4kxfv\") pod \"neutron-7fc6496dbc-7qwrw\" (UID: \"bf1502ef-50a4-45e0-b193-a6e25abccb32\") " pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.024686 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerStarted","Data":"433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74"} Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.026581 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerStarted","Data":"a12ed3e178db267afb04649b5718c1d2bcc9a78faa88e5e9519c5b32b47a1362"} Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.026711 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api" containerID="cri-o://a12ed3e178db267afb04649b5718c1d2bcc9a78faa88e5e9519c5b32b47a1362" gracePeriod=30 Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.026732 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.026711 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" 
containerName="cinder-api-log" containerID="cri-o://48ea775ffdf8bca7487b51cdb7c8d987f732c9702e1c95ad138a8eebbbab7c90" gracePeriod=30 Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.029602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerStarted","Data":"ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b"} Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.029744 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.037213 4926 generic.go:334] "Generic (PLEG): container finished" podID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerID="d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217" exitCode=0 Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.037281 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerDied","Data":"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217"} Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.044992 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.08737164 podStartE2EDuration="4.044977328s" podCreationTimestamp="2026-03-12 18:23:39 +0000 UTC" firstStartedPulling="2026-03-12 18:23:40.154063974 +0000 UTC m=+1260.522690307" lastFinishedPulling="2026-03-12 18:23:41.111669662 +0000 UTC m=+1261.480295995" observedRunningTime="2026-03-12 18:23:43.041335333 +0000 UTC m=+1263.409961666" watchObservedRunningTime="2026-03-12 18:23:43.044977328 +0000 UTC m=+1263.413603661" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.066600 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" podStartSLOduration=4.066585025 podStartE2EDuration="4.066585025s" podCreationTimestamp="2026-03-12 18:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:43.061577879 +0000 UTC m=+1263.430204202" watchObservedRunningTime="2026-03-12 18:23:43.066585025 +0000 UTC m=+1263.435211358" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.082283 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.082265117 podStartE2EDuration="4.082265117s" podCreationTimestamp="2026-03-12 18:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:43.081212805 +0000 UTC m=+1263.449839138" watchObservedRunningTime="2026-03-12 18:23:43.082265117 +0000 UTC m=+1263.450891450" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.172031 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:43 crc kubenswrapper[4926]: I0312 18:23:43.890232 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fc6496dbc-7qwrw"] Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.070979 4926 generic.go:334] "Generic (PLEG): container finished" podID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerID="a12ed3e178db267afb04649b5718c1d2bcc9a78faa88e5e9519c5b32b47a1362" exitCode=0 Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.071024 4926 generic.go:334] "Generic (PLEG): container finished" podID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerID="48ea775ffdf8bca7487b51cdb7c8d987f732c9702e1c95ad138a8eebbbab7c90" exitCode=143 Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.071065 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerDied","Data":"a12ed3e178db267afb04649b5718c1d2bcc9a78faa88e5e9519c5b32b47a1362"} Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.071104 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerDied","Data":"48ea775ffdf8bca7487b51cdb7c8d987f732c9702e1c95ad138a8eebbbab7c90"} Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.071114 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78cc2e75-39e6-4148-87a5-022cc3690da8","Type":"ContainerDied","Data":"d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4"} Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.071122 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b1d21f05ea6cb158f8a2f7033f48e548508b5ca903be77119e981940c0caa4" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.083258 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fc6496dbc-7qwrw" event={"ID":"bf1502ef-50a4-45e0-b193-a6e25abccb32","Type":"ContainerStarted","Data":"37d389d09dc11310aca6959f9e567d011711a2fc227393db818ccf9f61003f62"} Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.111074 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183223 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg65z\" (UniqueName: \"kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183268 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183322 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183427 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183443 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183469 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183525 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.183541 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs\") pod \"78cc2e75-39e6-4148-87a5-022cc3690da8\" (UID: \"78cc2e75-39e6-4148-87a5-022cc3690da8\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.184050 4926 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78cc2e75-39e6-4148-87a5-022cc3690da8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.187009 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs" (OuterVolumeSpecName: "logs") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.216558 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts" (OuterVolumeSpecName: "scripts") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.222318 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.227824 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z" (OuterVolumeSpecName: "kube-api-access-jg65z") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "kube-api-access-jg65z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.287592 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.287621 4926 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.287630 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78cc2e75-39e6-4148-87a5-022cc3690da8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.287638 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg65z\" (UniqueName: \"kubernetes.io/projected/78cc2e75-39e6-4148-87a5-022cc3690da8-kube-api-access-jg65z\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.291803 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.329403 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data" (OuterVolumeSpecName: "config-data") pod "78cc2e75-39e6-4148-87a5-022cc3690da8" (UID: "78cc2e75-39e6-4148-87a5-022cc3690da8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.389630 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.389668 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78cc2e75-39e6-4148-87a5-022cc3690da8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.558815 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.658457 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-65c5c86775-mct68" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.771266 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.901471 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.901753 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.901873 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.901957 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.901984 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.902025 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.902070 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrs2\" 
(UniqueName: \"kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2\") pod \"5fdfffa4-937c-4167-8545-d34f2007fbc9\" (UID: \"5fdfffa4-937c-4167-8545-d34f2007fbc9\") " Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.902385 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.902507 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.914093 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts" (OuterVolumeSpecName: "scripts") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.917628 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2" (OuterVolumeSpecName: "kube-api-access-smrs2") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "kube-api-access-smrs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:44 crc kubenswrapper[4926]: I0312 18:23:44.950097 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.006115 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrs2\" (UniqueName: \"kubernetes.io/projected/5fdfffa4-937c-4167-8545-d34f2007fbc9-kube-api-access-smrs2\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.006150 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.006160 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.006170 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.006178 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5fdfffa4-937c-4167-8545-d34f2007fbc9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.051808 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.075745 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data" (OuterVolumeSpecName: "config-data") pod "5fdfffa4-937c-4167-8545-d34f2007fbc9" (UID: "5fdfffa4-937c-4167-8545-d34f2007fbc9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.108744 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.108793 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdfffa4-937c-4167-8545-d34f2007fbc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.114883 4926 generic.go:334] "Generic (PLEG): container finished" podID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerID="aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84" exitCode=0 Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.114948 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerDied","Data":"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84"} Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.114976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5fdfffa4-937c-4167-8545-d34f2007fbc9","Type":"ContainerDied","Data":"fde586bc97eba127841e218475be2e383628bdf6feeb7595fc6e0b0daaf96628"} Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.114992 4926 scope.go:117] "RemoveContainer" containerID="d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.115111 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.135651 4926 generic.go:334] "Generic (PLEG): container finished" podID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerID="8231f1ef7dfab85032bd8e8cc8a2434c72aeff5f8977a2307df4d52b313ce335" exitCode=0 Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.135713 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerDied","Data":"8231f1ef7dfab85032bd8e8cc8a2434c72aeff5f8977a2307df4d52b313ce335"} Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.137960 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fc6496dbc-7qwrw" event={"ID":"bf1502ef-50a4-45e0-b193-a6e25abccb32","Type":"ContainerStarted","Data":"35a613e6b532db3489cd114f2dff0c75f70bff1b029f6056046d9fba6744021e"} Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.138019 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fc6496dbc-7qwrw" event={"ID":"bf1502ef-50a4-45e0-b193-a6e25abccb32","Type":"ContainerStarted","Data":"0867465e0603cad4755aa2a9a92b705554f77ad30ccd6ba55d0ec179f8c1aab3"} Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.138054 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7fc6496dbc-7qwrw" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.138129 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.171046 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7fc6496dbc-7qwrw" podStartSLOduration=3.17102843 podStartE2EDuration="3.17102843s" podCreationTimestamp="2026-03-12 18:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:45.169873295 +0000 UTC m=+1265.538499628" watchObservedRunningTime="2026-03-12 18:23:45.17102843 +0000 UTC m=+1265.539654763" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.211851 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.211988 4926 scope.go:117] "RemoveContainer" containerID="a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.220789 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.238655 4926 scope.go:117] "RemoveContainer" containerID="aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.247167 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.260465 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272284 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272830 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api-log" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272856 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api-log" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272872 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-central-agent" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272879 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-central-agent" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272893 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272899 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272908 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="proxy-httpd" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272914 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="proxy-httpd" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272922 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-notification-agent" Mar 12 18:23:45 crc 
kubenswrapper[4926]: I0312 18:23:45.272932 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-notification-agent" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.272959 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="sg-core" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.272967 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="sg-core" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273173 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api-log" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273192 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-central-agent" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273207 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="sg-core" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273218 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" containerName="cinder-api" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273234 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="ceilometer-notification-agent" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.273252 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" containerName="proxy-httpd" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.275403 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.281430 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.281578 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.281673 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.290920 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.294815 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.296783 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.297203 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.298311 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.299676 4926 scope.go:117] "RemoveContainer" containerID="3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.300188 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.333664 4926 scope.go:117] "RemoveContainer" containerID="d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.334418 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef\": container with ID starting with d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef not found: ID does not exist" containerID="d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.334463 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef"} err="failed to get container status \"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef\": rpc error: code = NotFound desc = could not find container \"d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef\": container with ID starting with d6ab048172eca2cb50eb57623ff1095082421f1f4c5137f11d5c1497dd206bef not found: ID does not exist" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.334485 4926 scope.go:117] "RemoveContainer" containerID="a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.335356 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b\": container with ID starting with a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b not found: ID does not exist" containerID="a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.335379 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b"} err="failed to get container status \"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b\": rpc error: code = NotFound desc = could not find container \"a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b\": container with ID starting with a28f9bce7474ed0af2327b3b39f88ffd436344c5e5a64a5327965e743ea69f3b not found: ID does not exist" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.335394 4926 scope.go:117] "RemoveContainer" containerID="aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84" Mar 12 18:23:45 
crc kubenswrapper[4926]: E0312 18:23:45.336292 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84\": container with ID starting with aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84 not found: ID does not exist" containerID="aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.336320 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84"} err="failed to get container status \"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84\": rpc error: code = NotFound desc = could not find container \"aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84\": container with ID starting with aff7f75f8e01b74292a231c7e54734896e7fa5d7cb06847602f5501bd5510f84 not found: ID does not exist" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.336338 4926 scope.go:117] "RemoveContainer" containerID="3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c" Mar 12 18:23:45 crc kubenswrapper[4926]: E0312 18:23:45.337539 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c\": container with ID starting with 3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c not found: ID does not exist" containerID="3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.337564 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c"} err="failed to get container status \"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c\": rpc error: code = NotFound desc = could not find container \"3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c\": container with ID starting with 3d2006e37fc8315276f34e264200f53f1fbfd12d88a77b340db54bf40d377b3c not found: ID does not exist" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418375 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmcf\" (UniqueName: \"kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418411 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418464 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data-custom\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418479 4926 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418499 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418643 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418811 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418843 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd14509-9e82-40a2-aea4-c6ad4250be05-logs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418878 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edd14509-9e82-40a2-aea4-c6ad4250be05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418913 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.418987 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnv8k\" (UniqueName: \"kubernetes.io/projected/edd14509-9e82-40a2-aea4-c6ad4250be05-kube-api-access-lnv8k\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.419019 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.419034 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-scripts\") pod \"cinder-api-0\" (UID: 
\"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.419111 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.419162 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.419202 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.475660 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524377 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data-custom\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524418 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524460 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524496 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524525 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524543 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd14509-9e82-40a2-aea4-c6ad4250be05-logs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc 
kubenswrapper[4926]: I0312 18:23:45.524558 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edd14509-9e82-40a2-aea4-c6ad4250be05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524576 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524609 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnv8k\" (UniqueName: \"kubernetes.io/projected/edd14509-9e82-40a2-aea4-c6ad4250be05-kube-api-access-lnv8k\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524625 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-scripts\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524640 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524672 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524690 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524711 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524745 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmcf\" (UniqueName: \"kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.524764 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc 
kubenswrapper[4926]: I0312 18:23:45.525282 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.531692 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd14509-9e82-40a2-aea4-c6ad4250be05-logs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.532601 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.533066 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.533206 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/edd14509-9e82-40a2-aea4-c6ad4250be05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.536338 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.536757 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-scripts\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.537031 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.537197 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-config-data-custom\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.537302 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.537413 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.538138 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.547619 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.548686 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edd14509-9e82-40a2-aea4-c6ad4250be05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.550734 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmcf\" (UniqueName: \"kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf\") pod \"ceilometer-0\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.551141 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnv8k\" (UniqueName: \"kubernetes.io/projected/edd14509-9e82-40a2-aea4-c6ad4250be05-kube-api-access-lnv8k\") pod \"cinder-api-0\" (UID: \"edd14509-9e82-40a2-aea4-c6ad4250be05\") " pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.626151 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldmj\" (UniqueName: \"kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj\") pod \"2f1f2a42-878e-46c0-bd66-4927a4689299\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.626206 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data\") pod \"2f1f2a42-878e-46c0-bd66-4927a4689299\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.626264 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom\") pod \"2f1f2a42-878e-46c0-bd66-4927a4689299\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.626298 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle\") pod \"2f1f2a42-878e-46c0-bd66-4927a4689299\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.626384 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs\") pod \"2f1f2a42-878e-46c0-bd66-4927a4689299\" (UID: \"2f1f2a42-878e-46c0-bd66-4927a4689299\") " Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.627417 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs" (OuterVolumeSpecName: "logs") pod "2f1f2a42-878e-46c0-bd66-4927a4689299" (UID: "2f1f2a42-878e-46c0-bd66-4927a4689299"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.630267 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f1f2a42-878e-46c0-bd66-4927a4689299" (UID: "2f1f2a42-878e-46c0-bd66-4927a4689299"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.633011 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.637657 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj" (OuterVolumeSpecName: "kube-api-access-qldmj") pod "2f1f2a42-878e-46c0-bd66-4927a4689299" (UID: "2f1f2a42-878e-46c0-bd66-4927a4689299"). InnerVolumeSpecName "kube-api-access-qldmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.663055 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f1f2a42-878e-46c0-bd66-4927a4689299" (UID: "2f1f2a42-878e-46c0-bd66-4927a4689299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.673901 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.696695 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data" (OuterVolumeSpecName: "config-data") pod "2f1f2a42-878e-46c0-bd66-4927a4689299" (UID: "2f1f2a42-878e-46c0-bd66-4927a4689299"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.728581 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f1f2a42-878e-46c0-bd66-4927a4689299-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.728614 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldmj\" (UniqueName: \"kubernetes.io/projected/2f1f2a42-878e-46c0-bd66-4927a4689299-kube-api-access-qldmj\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.728625 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.728636 4926 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:45 crc kubenswrapper[4926]: I0312 18:23:45.728646 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1f2a42-878e-46c0-bd66-4927a4689299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.151165 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9b55c586b-s7wqs" event={"ID":"2f1f2a42-878e-46c0-bd66-4927a4689299","Type":"ContainerDied","Data":"28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc"} Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.151157 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9b55c586b-s7wqs" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.151450 4926 scope.go:117] "RemoveContainer" containerID="8231f1ef7dfab85032bd8e8cc8a2434c72aeff5f8977a2307df4d52b313ce335" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.204989 4926 scope.go:117] "RemoveContainer" containerID="0ae594d417f33a2af41173b2f1260c8120ad7232590b0279e7d83cb0dbe67a9c" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.216618 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.228867 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9b55c586b-s7wqs"] Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.265431 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.279717 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.505944 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" path="/var/lib/kubelet/pods/2f1f2a42-878e-46c0-bd66-4927a4689299/volumes" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.506595 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fdfffa4-937c-4167-8545-d34f2007fbc9" path="/var/lib/kubelet/pods/5fdfffa4-937c-4167-8545-d34f2007fbc9/volumes" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.509559 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cc2e75-39e6-4148-87a5-022cc3690da8" path="/var/lib/kubelet/pods/78cc2e75-39e6-4148-87a5-022cc3690da8/volumes" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.562175 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5c6848d8cd-cq57n" Mar 12 18:23:46 crc kubenswrapper[4926]: I0312 18:23:46.602280 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-89554fb64-s9c6q" Mar 12 18:23:47 crc kubenswrapper[4926]: I0312 18:23:47.167865 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerStarted","Data":"f1898b8c34b4b06f2e10cb4f43fffdb55131049338d63dbe39254ab53b9a2db0"} Mar 12 18:23:47 crc kubenswrapper[4926]: I0312 18:23:47.170306 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"edd14509-9e82-40a2-aea4-c6ad4250be05","Type":"ContainerStarted","Data":"e10b6da3dd1ff8440bdbc1a8379a0386e7115d448c1e89e355b494fc9b57f163"} Mar 12 18:23:47 crc kubenswrapper[4926]: I0312 18:23:47.170329 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"edd14509-9e82-40a2-aea4-c6ad4250be05","Type":"ContainerStarted","Data":"deaafe388600bb6e713e42d64fd7e99daa40243a141341603a4a24f22c85d95b"} Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.190568 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"edd14509-9e82-40a2-aea4-c6ad4250be05","Type":"ContainerStarted","Data":"72ce98f1d314783c0caefbbcaa07ad14bc6f88c8f23b466acb4769e4fb57365f"} Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.190942 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 12 18:23:48 crc kubenswrapper[4926]: 
I0312 18:23:48.196391 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerStarted","Data":"a367820b72291dce3847ddbcc2b3a9f8493b17ddb9b80ab37f0c0b123bd5c72e"} Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.196453 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerStarted","Data":"a9eba08d9bcde00929ca468e28634454c84f19ace5c5074a2b2f0456e3403fef"} Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.216630 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.216611607 podStartE2EDuration="3.216611607s" podCreationTimestamp="2026-03-12 18:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:48.213096097 +0000 UTC m=+1268.581722440" watchObservedRunningTime="2026-03-12 18:23:48.216611607 +0000 UTC m=+1268.585237940" Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.310642 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5c6848d8cd-cq57n" Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.369974 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-89554fb64-s9c6q"] Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.370208 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon-log" containerID="cri-o://a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62" gracePeriod=30 Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.370681 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" containerID="cri-o://2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5" gracePeriod=30 Mar 12 18:23:48 crc kubenswrapper[4926]: I0312 18:23:48.383030 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.048753 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.108826 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.108949 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.109148 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.109656 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.110375 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzgvh\" (UniqueName: \"kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.110416 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.110495 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs\") pod \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\" (UID: \"2beed02e-2edf-4b52-8ea6-ae2dae7502d8\") " Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.114030 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.114453 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh" (OuterVolumeSpecName: "kube-api-access-nzgvh") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "kube-api-access-nzgvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.175796 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.187977 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.202505 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.209400 4926 generic.go:334] "Generic (PLEG): container finished" podID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerID="94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e" exitCode=0 Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.209475 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerDied","Data":"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e"} Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.209502 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-65c5c86775-mct68" event={"ID":"2beed02e-2edf-4b52-8ea6-ae2dae7502d8","Type":"ContainerDied","Data":"7b5dc2f13fd79b24bbd53ad5c8577f6fcd9167f1b3fea4e0a8139bb248f187ac"} Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.209518 4926 scope.go:117] "RemoveContainer" containerID="d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.209678 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-65c5c86775-mct68" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.213028 4926 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.213060 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.213068 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.213079 4926 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.213090 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzgvh\" (UniqueName: \"kubernetes.io/projected/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-kube-api-access-nzgvh\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.219101 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerStarted","Data":"8a7449dd308bb99ab10627007eaa35755447eacebb3f5175eab4f716527295a3"} Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.228281 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config" (OuterVolumeSpecName: "config") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.233943 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2beed02e-2edf-4b52-8ea6-ae2dae7502d8" (UID: "2beed02e-2edf-4b52-8ea6-ae2dae7502d8"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.296192 4926 scope.go:117] "RemoveContainer" containerID="94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.315322 4926 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.315374 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2beed02e-2edf-4b52-8ea6-ae2dae7502d8-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.319742 4926 scope.go:117] "RemoveContainer" containerID="d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217" Mar 12 18:23:49 crc kubenswrapper[4926]: E0312 18:23:49.320160 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217\": container with ID starting with d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217 not found: ID does not exist" containerID="d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.320188 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217"} err="failed to get container status \"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217\": rpc error: code = NotFound desc = could not find container \"d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217\": container with ID starting with d88e7111c6324f8b6f31f6801ff8076b863ecd2243e9702f347b8ae37763d217 not found: ID does not exist" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.320212 4926 scope.go:117] "RemoveContainer" containerID="94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e" Mar 12 18:23:49 crc kubenswrapper[4926]: E0312 18:23:49.320525 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e\": container with ID starting with 94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e not found: ID does not exist" containerID="94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.320549 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e"} err="failed to get container status \"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e\": rpc error: code = NotFound desc = could not find container \"94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e\": container with ID starting with 94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e not found: ID does not exist" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.555738 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.564795 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-65c5c86775-mct68"] Mar 12 
18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.624690 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.690206 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.694610 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="dnsmasq-dns" containerID="cri-o://3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b" gracePeriod=10 Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.870228 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.916372 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:49 crc kubenswrapper[4926]: I0312 18:23:49.976001 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.210069 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fc994c476-fv9c9" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234048 4926 generic.go:334] "Generic (PLEG): container finished" podID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerID="3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b" exitCode=0 Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234139 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" event={"ID":"08abfe56-0e5c-4634-9a1a-488e2bbb587d","Type":"ContainerDied","Data":"3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b"} Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234183 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" event={"ID":"08abfe56-0e5c-4634-9a1a-488e2bbb587d","Type":"ContainerDied","Data":"07e4278fe40ae8695fb94d6a2d5bec78c86b69a3255ccbd19aa22a7865e54876"} Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234198 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07e4278fe40ae8695fb94d6a2d5bec78c86b69a3255ccbd19aa22a7865e54876" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234263 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="cinder-scheduler" containerID="cri-o://5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc" gracePeriod=30 Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.234611 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="probe" containerID="cri-o://433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74" gracePeriod=30 Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.263740 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.334721 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxtxm\" (UniqueName: \"kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.334837 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.334959 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.335005 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.335099 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.335130 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb\") pod \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\" (UID: \"08abfe56-0e5c-4634-9a1a-488e2bbb587d\") " Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.340587 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm" (OuterVolumeSpecName: "kube-api-access-pxtxm") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "kube-api-access-pxtxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.391797 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.392685 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config" (OuterVolumeSpecName: "config") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.396033 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.414025 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.439574 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.439650 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.439665 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.439677 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxtxm\" (UniqueName: \"kubernetes.io/projected/08abfe56-0e5c-4634-9a1a-488e2bbb587d-kube-api-access-pxtxm\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.439711 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485399 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fdd856968-nmxxn"] Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485786 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="init" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485801 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="init" Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485817 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api-log" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485823 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api-log" Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485834 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-api" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485840 4926 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-api" Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485853 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="dnsmasq-dns" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485859 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="dnsmasq-dns" Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485868 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485873 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" Mar 12 18:23:50 crc kubenswrapper[4926]: E0312 18:23:50.485888 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.485894 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486054 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-api" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486066 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" containerName="neutron-httpd" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486074 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api-log" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486085 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" containerName="dnsmasq-dns" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486095 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1f2a42-878e-46c0-bd66-4927a4689299" containerName="barbican-api" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.486975 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.502288 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2beed02e-2edf-4b52-8ea6-ae2dae7502d8" path="/var/lib/kubelet/pods/2beed02e-2edf-4b52-8ea6-ae2dae7502d8/volumes" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.508420 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08abfe56-0e5c-4634-9a1a-488e2bbb587d" (UID: "08abfe56-0e5c-4634-9a1a-488e2bbb587d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.514291 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fdd856968-nmxxn"] Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542669 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-internal-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542811 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s94wd\" (UniqueName: \"kubernetes.io/projected/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-kube-api-access-s94wd\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542848 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-logs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542874 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-config-data\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542895 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-combined-ca-bundle\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542914 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-scripts\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542928 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-public-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.542988 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08abfe56-0e5c-4634-9a1a-488e2bbb587d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644286 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s94wd\" (UniqueName: 
\"kubernetes.io/projected/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-kube-api-access-s94wd\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644365 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-logs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644397 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-config-data\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644447 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-combined-ca-bundle\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644474 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-scripts\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644496 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-public-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.644549 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-internal-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.645189 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-logs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.647911 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-scripts\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.648970 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-internal-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " 
pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.649110 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-combined-ca-bundle\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.650082 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-config-data\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.654212 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-public-tls-certs\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.665099 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s94wd\" (UniqueName: \"kubernetes.io/projected/0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81-kube-api-access-s94wd\") pod \"placement-6fdd856968-nmxxn\" (UID: \"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81\") " pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:50 crc kubenswrapper[4926]: I0312 18:23:50.819872 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.250535 4926 generic.go:334] "Generic (PLEG): container finished" podID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerID="433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74" exitCode=0 Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.251094 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-gbmf8" Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.250652 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerDied","Data":"433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74"} Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.319163 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fdd856968-nmxxn"] Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.357675 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.368691 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-gbmf8"] Mar 12 18:23:51 crc kubenswrapper[4926]: I0312 18:23:51.839320 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:59182->10.217.0.152:8443: read: connection reset by peer" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.259988 4926 generic.go:334] "Generic (PLEG): container finished" podID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerID="2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5" exitCode=0 Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.260071 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerDied","Data":"2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5"} Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.261399 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fdd856968-nmxxn" event={"ID":"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81","Type":"ContainerStarted","Data":"16d5e379b276abe6132ccec2a9f5cb3df0aa08c2f32851e225b26b48de0a209e"} Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.261535 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fdd856968-nmxxn" event={"ID":"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81","Type":"ContainerStarted","Data":"2030315c68b9ec5e49c8e54e4477a702ee63c1c1a1199c8231f96ac1189cd4d4"} Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.261588 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fdd856968-nmxxn" event={"ID":"0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81","Type":"ContainerStarted","Data":"2975e38110c330439c82a74fcdb5b43027e0eb2466522cad66f4767299630c42"} Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.261772 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.261807 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.263771 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerStarted","Data":"27c0e5b7f4011faa0c2b0ee4f54cfe990c1a5e30f0bbe162914f8380a857f6ef"} Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.263940 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.283015 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fdd856968-nmxxn" podStartSLOduration=2.282994966 podStartE2EDuration="2.282994966s" podCreationTimestamp="2026-03-12 18:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:52.279542968 +0000 UTC m=+1272.648169311" watchObservedRunningTime="2026-03-12 18:23:52.282994966 +0000 UTC m=+1272.651621299" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.308825 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.471354642 podStartE2EDuration="7.308801256s" podCreationTimestamp="2026-03-12 18:23:45 +0000 UTC" firstStartedPulling="2026-03-12 18:23:46.30459024 +0000 UTC m=+1266.673216573" lastFinishedPulling="2026-03-12 18:23:51.142036854 +0000 UTC m=+1271.510663187" observedRunningTime="2026-03-12 18:23:52.303313623 +0000 UTC m=+1272.671939956" watchObservedRunningTime="2026-03-12 18:23:52.308801256 +0000 UTC m=+1272.677427579" Mar 12 18:23:52 crc kubenswrapper[4926]: I0312 18:23:52.500238 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08abfe56-0e5c-4634-9a1a-488e2bbb587d" path="/var/lib/kubelet/pods/08abfe56-0e5c-4634-9a1a-488e2bbb587d/volumes" Mar 12 18:23:53 crc kubenswrapper[4926]: I0312 18:23:53.217671 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.282033 4926 generic.go:334] "Generic (PLEG): container finished" podID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerID="5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc" exitCode=0 Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.282112 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerDied","Data":"5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc"} Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.685554 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.736732 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.736806 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.736992 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpn6w\" (UniqueName: \"kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.737058 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.737139 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.737175 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data\") pod \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\" (UID: \"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c\") " Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.739906 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.765542 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w" (OuterVolumeSpecName: "kube-api-access-jpn6w") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "kube-api-access-jpn6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.765613 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts" (OuterVolumeSpecName: "scripts") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.784583 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.817156 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.840039 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpn6w\" (UniqueName: \"kubernetes.io/projected/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-kube-api-access-jpn6w\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.840072 4926 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.840082 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.840093 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.840104 4926 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.885114 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data" (OuterVolumeSpecName: "config-data") pod "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" (UID: "1a13f5e0-72f6-4c47-a5ea-349c2d618d8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:23:54 crc kubenswrapper[4926]: I0312 18:23:54.941861 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.292705 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1a13f5e0-72f6-4c47-a5ea-349c2d618d8c","Type":"ContainerDied","Data":"b5b79d1f13639d906d19091084d8e1ea42e28328a4dc8a574f851692ee3f9f8c"} Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.292768 4926 scope.go:117] "RemoveContainer" containerID="433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.292815 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.349106 4926 scope.go:117] "RemoveContainer" containerID="5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.355553 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.375423 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.383906 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:55 crc kubenswrapper[4926]: E0312 18:23:55.384367 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="cinder-scheduler" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.384385 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="cinder-scheduler" Mar 12 18:23:55 crc kubenswrapper[4926]: E0312 18:23:55.384417 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="probe" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.384426 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="probe" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.384693 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="probe" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.384719 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" containerName="cinder-scheduler" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.385762 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.387745 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.397261 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450137 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450204 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b586d98-12e7-4814-82c6-6724e1b35a77-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450241 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450274 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktwx7\" (UniqueName: \"kubernetes.io/projected/7b586d98-12e7-4814-82c6-6724e1b35a77-kube-api-access-ktwx7\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450293 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.450704 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.552557 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553259 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553293 4926 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b586d98-12e7-4814-82c6-6724e1b35a77-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553352 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b586d98-12e7-4814-82c6-6724e1b35a77-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553384 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553499 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktwx7\" (UniqueName: \"kubernetes.io/projected/7b586d98-12e7-4814-82c6-6724e1b35a77-kube-api-access-ktwx7\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.553534 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.558976 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.567093 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.569254 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.569916 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b586d98-12e7-4814-82c6-6724e1b35a77-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 crc kubenswrapper[4926]: I0312 18:23:55.570943 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktwx7\" (UniqueName: \"kubernetes.io/projected/7b586d98-12e7-4814-82c6-6724e1b35a77-kube-api-access-ktwx7\") pod \"cinder-scheduler-0\" (UID: \"7b586d98-12e7-4814-82c6-6724e1b35a77\") " pod="openstack/cinder-scheduler-0" Mar 12 18:23:55 
crc kubenswrapper[4926]: I0312 18:23:55.707967 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 18:23:56 crc kubenswrapper[4926]: I0312 18:23:56.220452 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 18:23:56 crc kubenswrapper[4926]: I0312 18:23:56.314614 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b586d98-12e7-4814-82c6-6724e1b35a77","Type":"ContainerStarted","Data":"b092141ebd6c3282eb64208c6d0eada485cc3f4e87f9e320f3e0527985ac5d9b"} Mar 12 18:23:56 crc kubenswrapper[4926]: I0312 18:23:56.500068 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a13f5e0-72f6-4c47-a5ea-349c2d618d8c" path="/var/lib/kubelet/pods/1a13f5e0-72f6-4c47-a5ea-349c2d618d8c/volumes" Mar 12 18:23:56 crc kubenswrapper[4926]: I0312 18:23:56.817515 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:23:56 crc kubenswrapper[4926]: I0312 18:23:56.817571 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:23:57 crc kubenswrapper[4926]: I0312 18:23:57.328976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b586d98-12e7-4814-82c6-6724e1b35a77","Type":"ContainerStarted","Data":"88264eae8cfce90c518596add7ceb9977069fc99141c87436e9831d2980346ae"} Mar 12 18:23:57 crc kubenswrapper[4926]: I0312 18:23:57.388025 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b667c464b-fk8sc" Mar 12 18:23:57 crc kubenswrapper[4926]: I0312 18:23:57.745250 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 12 18:23:58 crc kubenswrapper[4926]: I0312 18:23:58.340584 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b586d98-12e7-4814-82c6-6724e1b35a77","Type":"ContainerStarted","Data":"3c43768b9e52861674aa32825296172a82245451794d4d7809a7351482d03074"} Mar 12 18:23:58 crc kubenswrapper[4926]: I0312 18:23:58.372106 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.372083915 podStartE2EDuration="3.372083915s" podCreationTimestamp="2026-03-12 18:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:23:58.356927939 +0000 UTC m=+1278.725554292" watchObservedRunningTime="2026-03-12 18:23:58.372083915 +0000 UTC m=+1278.740710248" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.143595 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555664-xqlj7"] Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.144719 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.149196 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.150390 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.150623 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.173350 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555664-xqlj7"] Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.257907 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznnp\" (UniqueName: \"kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp\") pod \"auto-csr-approver-29555664-xqlj7\" (UID: \"d0a93b50-2038-4bf8-8c5f-bc77148d55f8\") " pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.359860 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gznnp\" (UniqueName: \"kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp\") pod \"auto-csr-approver-29555664-xqlj7\" (UID: \"d0a93b50-2038-4bf8-8c5f-bc77148d55f8\") " pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.384498 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznnp\" (UniqueName: \"kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp\") pod \"auto-csr-approver-29555664-xqlj7\" (UID: \"d0a93b50-2038-4bf8-8c5f-bc77148d55f8\") " pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.476641 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.708226 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.922113 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bbdc94bc7-pmtmm"] Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.925248 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.932427 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.932621 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.932699 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.943322 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bbdc94bc7-pmtmm"] Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971078 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-config-data\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971164 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-public-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971198 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-combined-ca-bundle\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971236 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5xmq\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-kube-api-access-x5xmq\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971329 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-etc-swift\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971467 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-log-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971583 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-internal-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " 
pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.971614 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-run-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:00 crc kubenswrapper[4926]: I0312 18:24:00.998384 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555664-xqlj7"] Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072691 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-internal-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072759 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-run-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072817 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-config-data\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072875 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-public-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072906 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-combined-ca-bundle\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072949 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5xmq\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-kube-api-access-x5xmq\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.072986 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-etc-swift\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.073032 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-log-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.073265 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-run-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.073561 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/09eb7b3b-c5af-4625-8f1a-83766550711c-log-httpd\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.089535 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-internal-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.091205 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-etc-swift\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.093486 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-public-tls-certs\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.094321 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-combined-ca-bundle\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.099525 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5xmq\" (UniqueName: \"kubernetes.io/projected/09eb7b3b-c5af-4625-8f1a-83766550711c-kube-api-access-x5xmq\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.099951 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09eb7b3b-c5af-4625-8f1a-83766550711c-config-data\") pod \"swift-proxy-bbdc94bc7-pmtmm\" (UID: \"09eb7b3b-c5af-4625-8f1a-83766550711c\") " pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.250820 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.369481 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" event={"ID":"d0a93b50-2038-4bf8-8c5f-bc77148d55f8","Type":"ContainerStarted","Data":"b73f746aeaacae651cc800c031298955a2feeae1651f82a533f65b94b8415760"} Mar 12 18:24:01 crc kubenswrapper[4926]: I0312 18:24:01.820801 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bbdc94bc7-pmtmm"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.049499 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.051141 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.053409 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-82l47" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.059017 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.059313 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.059495 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.064712 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.094970 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs\") pod \"da936793-13b1-4815-a1ec-4d5d609ca5e3\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095047 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom\") pod \"da936793-13b1-4815-a1ec-4d5d609ca5e3\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095144 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m7fh\" (UniqueName: \"kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh\") pod \"da936793-13b1-4815-a1ec-4d5d609ca5e3\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095215 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle\") pod \"da936793-13b1-4815-a1ec-4d5d609ca5e3\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095270 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data\") pod \"da936793-13b1-4815-a1ec-4d5d609ca5e3\" (UID: \"da936793-13b1-4815-a1ec-4d5d609ca5e3\") " Mar 12 18:24:02 crc 
kubenswrapper[4926]: I0312 18:24:02.095477 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095545 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095602 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9kxz\" (UniqueName: \"kubernetes.io/projected/8ac368f3-42fb-4f4a-ba68-1686386b017e-kube-api-access-s9kxz\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.095620 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.096279 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs" (OuterVolumeSpecName: "logs") pod "da936793-13b1-4815-a1ec-4d5d609ca5e3" (UID: "da936793-13b1-4815-a1ec-4d5d609ca5e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.102726 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da936793-13b1-4815-a1ec-4d5d609ca5e3" (UID: "da936793-13b1-4815-a1ec-4d5d609ca5e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.108493 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh" (OuterVolumeSpecName: "kube-api-access-2m7fh") pod "da936793-13b1-4815-a1ec-4d5d609ca5e3" (UID: "da936793-13b1-4815-a1ec-4d5d609ca5e3"). InnerVolumeSpecName "kube-api-access-2m7fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.149768 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da936793-13b1-4815-a1ec-4d5d609ca5e3" (UID: "da936793-13b1-4815-a1ec-4d5d609ca5e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.188901 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data" (OuterVolumeSpecName: "config-data") pod "da936793-13b1-4815-a1ec-4d5d609ca5e3" (UID: "da936793-13b1-4815-a1ec-4d5d609ca5e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.198625 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.198871 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199104 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9kxz\" (UniqueName: \"kubernetes.io/projected/8ac368f3-42fb-4f4a-ba68-1686386b017e-kube-api-access-s9kxz\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199147 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199361 4926 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199406 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m7fh\" (UniqueName: \"kubernetes.io/projected/da936793-13b1-4815-a1ec-4d5d609ca5e3-kube-api-access-2m7fh\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199429 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199471 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da936793-13b1-4815-a1ec-4d5d609ca5e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.199489 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da936793-13b1-4815-a1ec-4d5d609ca5e3-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.200667 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.206419 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-openstack-config-secret\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.206978 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac368f3-42fb-4f4a-ba68-1686386b017e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.218652 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9kxz\" (UniqueName: \"kubernetes.io/projected/8ac368f3-42fb-4f4a-ba68-1686386b017e-kube-api-access-s9kxz\") pod \"openstackclient\" (UID: \"8ac368f3-42fb-4f4a-ba68-1686386b017e\") " pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.374351 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.380473 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" event={"ID":"09eb7b3b-c5af-4625-8f1a-83766550711c","Type":"ContainerStarted","Data":"0bd4229260fb8fe792c73d8e69cf5442f931f736c0b59deda9ffc0fef5be3a78"} Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.380515 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" event={"ID":"09eb7b3b-c5af-4625-8f1a-83766550711c","Type":"ContainerStarted","Data":"f60997be790e511a986a9ab57afe7a3d5ed6fd416e480154b493f4664ba9afab"} Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.395585 4926 generic.go:334] "Generic (PLEG): container finished" podID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerID="7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8" exitCode=137 Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.395642 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.395641 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerDied","Data":"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8"} Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.396369 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-94956785d-mtl2w" event={"ID":"da936793-13b1-4815-a1ec-4d5d609ca5e3","Type":"ContainerDied","Data":"01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd"} Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.396403 4926 scope.go:117] "RemoveContainer" containerID="7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.552426 4926 scope.go:117] "RemoveContainer" containerID="38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.564746 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.573661 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-94956785d-mtl2w"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.618074 4926 scope.go:117] "RemoveContainer" containerID="7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8" Mar 12 18:24:02 crc kubenswrapper[4926]: E0312 18:24:02.623502 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8\": container with ID starting with 7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8 not found: ID does not exist" containerID="7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.623557 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8"} err="failed to get container status \"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8\": rpc error: code = NotFound desc = could not find container \"7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8\": container with ID starting with 7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8 not found: ID does not exist" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.623600 4926 scope.go:117] "RemoveContainer" containerID="38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2" Mar 12 18:24:02 crc kubenswrapper[4926]: E0312 18:24:02.624109 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2\": container with ID starting with 38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2 not found: ID does not exist" containerID="38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.624154 4926 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2"} err="failed to get container status \"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2\": rpc error: code = NotFound desc = could not find container \"38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2\": container with ID starting with 38486c735a1623333651d8064ff2479a19bcc2c864a0885d812a7528e1e135d2 not found: ID does not exist" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.861564 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.861863 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-central-agent" containerID="cri-o://a367820b72291dce3847ddbcc2b3a9f8493b17ddb9b80ab37f0c0b123bd5c72e" gracePeriod=30 Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.862507 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="proxy-httpd" containerID="cri-o://27c0e5b7f4011faa0c2b0ee4f54cfe990c1a5e30f0bbe162914f8380a857f6ef" gracePeriod=30 Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.862649 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-notification-agent" containerID="cri-o://a9eba08d9bcde00929ca468e28634454c84f19ace5c5074a2b2f0456e3403fef" gracePeriod=30 Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.862663 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="sg-core" containerID="cri-o://8a7449dd308bb99ab10627007eaa35755447eacebb3f5175eab4f716527295a3" gracePeriod=30 Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.873576 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": EOF" Mar 12 18:24:02 crc kubenswrapper[4926]: I0312 18:24:02.913027 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.218089 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.406469 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8ac368f3-42fb-4f4a-ba68-1686386b017e","Type":"ContainerStarted","Data":"858d0c3a8121a26a1a23cb8cc32005f95c2ce3b6c8d15c7165700931ba74ca6f"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.409916 4926 generic.go:334] "Generic (PLEG): container finished" podID="d0a93b50-2038-4bf8-8c5f-bc77148d55f8" containerID="e0b6e5ac15bb107790c6fb8dd27667c6693088633c203e8fcc884a1e78a79525" exitCode=0 Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.409976 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29555664-xqlj7" event={"ID":"d0a93b50-2038-4bf8-8c5f-bc77148d55f8","Type":"ContainerDied","Data":"e0b6e5ac15bb107790c6fb8dd27667c6693088633c203e8fcc884a1e78a79525"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.421418 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" event={"ID":"09eb7b3b-c5af-4625-8f1a-83766550711c","Type":"ContainerStarted","Data":"cb5574985a441766effb628172e6b7ca47066f959ac3872ce7b092cf6411844d"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.422714 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.422755 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447177 4926 generic.go:334] "Generic (PLEG): container finished" podID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerID="27c0e5b7f4011faa0c2b0ee4f54cfe990c1a5e30f0bbe162914f8380a857f6ef" exitCode=0 Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447212 4926 generic.go:334] "Generic (PLEG): container finished" podID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerID="8a7449dd308bb99ab10627007eaa35755447eacebb3f5175eab4f716527295a3" exitCode=2 Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447221 4926 generic.go:334] "Generic (PLEG): container finished" podID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerID="a9eba08d9bcde00929ca468e28634454c84f19ace5c5074a2b2f0456e3403fef" exitCode=0 Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447227 4926 generic.go:334] "Generic (PLEG): container finished" podID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerID="a367820b72291dce3847ddbcc2b3a9f8493b17ddb9b80ab37f0c0b123bd5c72e" exitCode=0 Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447286 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerDied","Data":"27c0e5b7f4011faa0c2b0ee4f54cfe990c1a5e30f0bbe162914f8380a857f6ef"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447326 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerDied","Data":"8a7449dd308bb99ab10627007eaa35755447eacebb3f5175eab4f716527295a3"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447336 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerDied","Data":"a9eba08d9bcde00929ca468e28634454c84f19ace5c5074a2b2f0456e3403fef"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.447345 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerDied","Data":"a367820b72291dce3847ddbcc2b3a9f8493b17ddb9b80ab37f0c0b123bd5c72e"} Mar 12 18:24:03 crc kubenswrapper[4926]: I0312 18:24:03.459286 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" podStartSLOduration=3.459256735 podStartE2EDuration="3.459256735s" podCreationTimestamp="2026-03-12 18:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:03.44350779 +0000 UTC 
m=+1283.812134123" watchObservedRunningTime="2026-03-12 18:24:03.459256735 +0000 UTC m=+1283.827883068" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.010915 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.182376 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183067 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183151 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bmcf\" (UniqueName: \"kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183720 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183775 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183840 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.183993 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd\") pod \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\" (UID: \"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d\") " Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.184093 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.185048 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.185282 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.190340 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf" (OuterVolumeSpecName: "kube-api-access-8bmcf") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "kube-api-access-8bmcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.213709 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts" (OuterVolumeSpecName: "scripts") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.224534 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.287868 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.287897 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bmcf\" (UniqueName: \"kubernetes.io/projected/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-kube-api-access-8bmcf\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.287907 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.287921 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.288004 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.312550 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data" (OuterVolumeSpecName: "config-data") pod "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" (UID: "a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.389318 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.389369 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.467267 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d","Type":"ContainerDied","Data":"f1898b8c34b4b06f2e10cb4f43fffdb55131049338d63dbe39254ab53b9a2db0"} Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.467384 4926 scope.go:117] "RemoveContainer" containerID="27c0e5b7f4011faa0c2b0ee4f54cfe990c1a5e30f0bbe162914f8380a857f6ef" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.467759 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.499150 4926 scope.go:117] "RemoveContainer" containerID="8a7449dd308bb99ab10627007eaa35755447eacebb3f5175eab4f716527295a3" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.558072 4926 scope.go:117] "RemoveContainer" containerID="a9eba08d9bcde00929ca468e28634454c84f19ace5c5074a2b2f0456e3403fef" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.603543 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" path="/var/lib/kubelet/pods/da936793-13b1-4815-a1ec-4d5d609ca5e3/volumes" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.604287 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.604317 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.604336 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605082 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605105 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener" Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605151 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-notification-agent" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605179 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-notification-agent" 
Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605205 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-central-agent" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605211 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-central-agent" Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605236 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener-log" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605243 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener-log" Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605257 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="proxy-httpd" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605266 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="proxy-httpd" Mar 12 18:24:04 crc kubenswrapper[4926]: E0312 18:24:04.605287 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="sg-core" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605293 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="sg-core" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605641 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605670 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-central-agent" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605696 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="proxy-httpd" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605711 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="sg-core" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605723 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" containerName="ceilometer-notification-agent" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.605738 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="da936793-13b1-4815-a1ec-4d5d609ca5e3" containerName="barbican-keystone-listener-log" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.609674 4926 scope.go:117] "RemoveContainer" containerID="a367820b72291dce3847ddbcc2b3a9f8493b17ddb9b80ab37f0c0b123bd5c72e" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.635897 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.636081 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.638879 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.639159 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715107 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715526 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715629 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715650 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715704 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.715828 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb8t5\" (UniqueName: \"kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817300 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817429 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817492 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb8t5\" (UniqueName: \"kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817540 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817579 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817736 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.817760 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.818310 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.821080 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.826381 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.832559 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.832715 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.834177 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.842213 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb8t5\" (UniqueName: \"kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5\") pod \"ceilometer-0\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") " pod="openstack/ceilometer-0" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.917117 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:04 crc kubenswrapper[4926]: I0312 18:24:04.971599 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.020716 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznnp\" (UniqueName: \"kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp\") pod \"d0a93b50-2038-4bf8-8c5f-bc77148d55f8\" (UID: \"d0a93b50-2038-4bf8-8c5f-bc77148d55f8\") " Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.023834 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp" (OuterVolumeSpecName: "kube-api-access-gznnp") pod "d0a93b50-2038-4bf8-8c5f-bc77148d55f8" (UID: "d0a93b50-2038-4bf8-8c5f-bc77148d55f8"). InnerVolumeSpecName "kube-api-access-gznnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.123546 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gznnp\" (UniqueName: \"kubernetes.io/projected/d0a93b50-2038-4bf8-8c5f-bc77148d55f8-kube-api-access-gznnp\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.465883 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.482260 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.482256 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555664-xqlj7" event={"ID":"d0a93b50-2038-4bf8-8c5f-bc77148d55f8","Type":"ContainerDied","Data":"b73f746aeaacae651cc800c031298955a2feeae1651f82a533f65b94b8415760"} Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.482336 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73f746aeaacae651cc800c031298955a2feeae1651f82a533f65b94b8415760" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.966569 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 18:24:05 crc kubenswrapper[4926]: I0312 18:24:05.993383 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555658-gfgnc"] Mar 12 18:24:06 crc kubenswrapper[4926]: I0312 18:24:06.008341 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555658-gfgnc"] Mar 12 18:24:06 crc kubenswrapper[4926]: I0312 18:24:06.506580 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b727d6f-6c9d-44ad-8594-accc9d2c4ed6" path="/var/lib/kubelet/pods/4b727d6f-6c9d-44ad-8594-accc9d2c4ed6/volumes" Mar 12 18:24:06 crc kubenswrapper[4926]: I0312 18:24:06.510368 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d" path="/var/lib/kubelet/pods/a64c41ad-a2b0-46ae-8d4c-fa0b01b8649d/volumes" Mar 12 18:24:06 crc kubenswrapper[4926]: I0312 18:24:06.512240 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerStarted","Data":"122b7926eeb56b52d198ea8fb2b2bcfa65e9df33208dc1088b72c07833d13975"} Mar 12 18:24:06 crc kubenswrapper[4926]: I0312 18:24:06.512288 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerStarted","Data":"b830a701f318d482f59209b1a64738cbe2648c8287a45c52a5d2852a56c3420e"} Mar 12 18:24:07 crc kubenswrapper[4926]: I0312 18:24:07.503000 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerStarted","Data":"2edaaa27e784ed522859eae2d6ff079de37094800a6747ee19848273e81ce307"} Mar 12 18:24:08 crc kubenswrapper[4926]: I0312 18:24:08.515869 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerStarted","Data":"fb25c14164a8c733a356a95d30484debd145ffe5ef07e6cd41f8828fbbe4674f"} Mar 12 18:24:11 crc kubenswrapper[4926]: I0312 18:24:11.257313 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:11 crc kubenswrapper[4926]: I0312 18:24:11.257651 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" Mar 12 18:24:12 crc kubenswrapper[4926]: I0312 18:24:12.129932 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:12 crc kubenswrapper[4926]: I0312 18:24:12.377292 4926 scope.go:117] "RemoveContainer" containerID="287cfd2e2e77a9b487765ae984d056f483b704272f97333956b53dc7f101b02d" Mar 12 18:24:13 crc 
kubenswrapper[4926]: I0312 18:24:13.190922 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7fc6496dbc-7qwrw"
Mar 12 18:24:13 crc kubenswrapper[4926]: I0312 18:24:13.217719 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-89554fb64-s9c6q" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused"
Mar 12 18:24:13 crc kubenswrapper[4926]: I0312 18:24:13.298142 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"]
Mar 12 18:24:13 crc kubenswrapper[4926]: I0312 18:24:13.298346 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-884f7b65b-tpkzl" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-api" containerID="cri-o://e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84" gracePeriod=30
Mar 12 18:24:13 crc kubenswrapper[4926]: I0312 18:24:13.298930 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-884f7b65b-tpkzl" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-httpd" containerID="cri-o://d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529" gracePeriod=30
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.573920 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8ac368f3-42fb-4f4a-ba68-1686386b017e","Type":"ContainerStarted","Data":"5e31eedfcc536e64dcd654a034ae030854b30d0c69f093c2fd0e94322552616d"}
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.576921 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerStarted","Data":"a5363a011abc4afc5c22e60ffd3c571ee4c452c8a1940dab6a8d1fc67f9718cb"}
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.577096 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-central-agent" containerID="cri-o://122b7926eeb56b52d198ea8fb2b2bcfa65e9df33208dc1088b72c07833d13975" gracePeriod=30
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.577162 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.577179 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="proxy-httpd" containerID="cri-o://a5363a011abc4afc5c22e60ffd3c571ee4c452c8a1940dab6a8d1fc67f9718cb" gracePeriod=30
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.577222 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-notification-agent" containerID="cri-o://2edaaa27e784ed522859eae2d6ff079de37094800a6747ee19848273e81ce307" gracePeriod=30
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.577282 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="sg-core" containerID="cri-o://fb25c14164a8c733a356a95d30484debd145ffe5ef07e6cd41f8828fbbe4674f" gracePeriod=30
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.580159 4926 generic.go:334] "Generic (PLEG): container finished" podID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerID="d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529" exitCode=0
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.580197 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerDied","Data":"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"}
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.590886 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.803950591 podStartE2EDuration="12.590869652s" podCreationTimestamp="2026-03-12 18:24:02 +0000 UTC" firstStartedPulling="2026-03-12 18:24:02.893795322 +0000 UTC m=+1283.262421655" lastFinishedPulling="2026-03-12 18:24:13.680714383 +0000 UTC m=+1294.049340716" observedRunningTime="2026-03-12 18:24:14.590510601 +0000 UTC m=+1294.959136934" watchObservedRunningTime="2026-03-12 18:24:14.590869652 +0000 UTC m=+1294.959495985"
Mar 12 18:24:14 crc kubenswrapper[4926]: I0312 18:24:14.620380 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.403011626 podStartE2EDuration="10.620355737s" podCreationTimestamp="2026-03-12 18:24:04 +0000 UTC" firstStartedPulling="2026-03-12 18:24:05.484472274 +0000 UTC m=+1285.853098607" lastFinishedPulling="2026-03-12 18:24:13.701816385 +0000 UTC m=+1294.070442718" observedRunningTime="2026-03-12 18:24:14.614416631 +0000 UTC m=+1294.983042984" watchObservedRunningTime="2026-03-12 18:24:14.620355737 +0000 UTC m=+1294.988982070"
Mar 12 18:24:14 crc kubenswrapper[4926]: E0312 18:24:14.716241 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/c2e53ff075e298480760552cda2544f1e9b4139d48e822beda05f83234508b59/diff" to get inode usage: stat /var/lib/containers/storage/overlay/c2e53ff075e298480760552cda2544f1e9b4139d48e822beda05f83234508b59/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_ceilometer-0_5fdfffa4-937c-4167-8545-d34f2007fbc9/ceilometer-notification-agent/0.log" to get inode usage: stat /var/log/pods/openstack_ceilometer-0_5fdfffa4-937c-4167-8545-d34f2007fbc9/ceilometer-notification-agent/0.log: no such file or directory
Mar 12 18:24:15 crc kubenswrapper[4926]: E0312 18:24:15.338133 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/eb3ee65486eee702d32220ece4785ba2ffa0d61d40fd91e9ccbf68ec12ce3c76/diff" to get inode usage: stat /var/lib/containers/storage/overlay/eb3ee65486eee702d32220ece4785ba2ffa0d61d40fd91e9ccbf68ec12ce3c76/diff: no such file or directory, extraDiskErr:
Mar 12 18:24:15 crc kubenswrapper[4926]: E0312 18:24:15.498654 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/3848119c075ad08b2848fa09f9b39831f837a8bb14ca7b0747e0d453e8757f2e/diff" to get inode usage: stat /var/lib/containers/storage/overlay/3848119c075ad08b2848fa09f9b39831f837a8bb14ca7b0747e0d453e8757f2e/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_neutron-65c5c86775-mct68_2beed02e-2edf-4b52-8ea6-ae2dae7502d8/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-65c5c86775-mct68_2beed02e-2edf-4b52-8ea6-ae2dae7502d8/neutron-api/0.log: no such file or directory
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597578 4926 generic.go:334] "Generic (PLEG): container finished" podID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerID="a5363a011abc4afc5c22e60ffd3c571ee4c452c8a1940dab6a8d1fc67f9718cb" exitCode=0
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597624 4926 generic.go:334] "Generic (PLEG): container finished" podID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerID="fb25c14164a8c733a356a95d30484debd145ffe5ef07e6cd41f8828fbbe4674f" exitCode=2
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597635 4926 generic.go:334] "Generic (PLEG): container finished" podID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerID="2edaaa27e784ed522859eae2d6ff079de37094800a6747ee19848273e81ce307" exitCode=0
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597634 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerDied","Data":"a5363a011abc4afc5c22e60ffd3c571ee4c452c8a1940dab6a8d1fc67f9718cb"}
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597685 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerDied","Data":"fb25c14164a8c733a356a95d30484debd145ffe5ef07e6cd41f8828fbbe4674f"}
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597700 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerDied","Data":"2edaaa27e784ed522859eae2d6ff079de37094800a6747ee19848273e81ce307"}
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597713 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerDied","Data":"122b7926eeb56b52d198ea8fb2b2bcfa65e9df33208dc1088b72c07833d13975"}
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.597649 4926 generic.go:334] "Generic (PLEG): container finished" podID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerID="122b7926eeb56b52d198ea8fb2b2bcfa65e9df33208dc1088b72c07833d13975" exitCode=0
Mar 12 18:24:15 crc kubenswrapper[4926]: E0312 18:24:15.777537 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/83a98e7d89ad57b3c990c9d485f73210c416f72d87846ea5492e22a752654c27/diff" to get inode usage: stat /var/lib/containers/storage/overlay/83a98e7d89ad57b3c990c9d485f73210c416f72d87846ea5492e22a752654c27/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_neutron-65c5c86775-mct68_2beed02e-2edf-4b52-8ea6-ae2dae7502d8/neutron-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-65c5c86775-mct68_2beed02e-2edf-4b52-8ea6-ae2dae7502d8/neutron-httpd/0.log: no such file or directory
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.786130 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849405 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849587 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849609 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849696 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849727 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb8t5\" (UniqueName: \"kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849762 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.849807 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd\") pod \"785370ab-aa78-4362-adf4-fedc8a0aedf9\" (UID: \"785370ab-aa78-4362-adf4-fedc8a0aedf9\") "
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.850747 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.851413 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.854761 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts" (OuterVolumeSpecName: "scripts") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.869783 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5" (OuterVolumeSpecName: "kube-api-access-lb8t5") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "kube-api-access-lb8t5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.901152 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.951726 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.951767 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb8t5\" (UniqueName: \"kubernetes.io/projected/785370ab-aa78-4362-adf4-fedc8a0aedf9-kube-api-access-lb8t5\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.951781 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.951792 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.951804 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/785370ab-aa78-4362-adf4-fedc8a0aedf9-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.955794 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:15 crc kubenswrapper[4926]: I0312 18:24:15.975604 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data" (OuterVolumeSpecName: "config-data") pod "785370ab-aa78-4362-adf4-fedc8a0aedf9" (UID: "785370ab-aa78-4362-adf4-fedc8a0aedf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.053359 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.053390 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785370ab-aa78-4362-adf4-fedc8a0aedf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.607107 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"785370ab-aa78-4362-adf4-fedc8a0aedf9","Type":"ContainerDied","Data":"b830a701f318d482f59209b1a64738cbe2648c8287a45c52a5d2852a56c3420e"}
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.607380 4926 scope.go:117] "RemoveContainer" containerID="a5363a011abc4afc5c22e60ffd3c571ee4c452c8a1940dab6a8d1fc67f9718cb"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.607302 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.633129 4926 scope.go:117] "RemoveContainer" containerID="fb25c14164a8c733a356a95d30484debd145ffe5ef07e6cd41f8828fbbe4674f"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.639487 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.648242 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.660760 4926 scope.go:117] "RemoveContainer" containerID="2edaaa27e784ed522859eae2d6ff079de37094800a6747ee19848273e81ce307"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.668961 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:16 crc kubenswrapper[4926]: E0312 18:24:16.669352 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a93b50-2038-4bf8-8c5f-bc77148d55f8" containerName="oc"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669372 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a93b50-2038-4bf8-8c5f-bc77148d55f8" containerName="oc"
Mar 12 18:24:16 crc kubenswrapper[4926]: E0312 18:24:16.669386 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="proxy-httpd"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669392 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="proxy-httpd"
Mar 12 18:24:16 crc kubenswrapper[4926]: E0312 18:24:16.669402 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-central-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669408 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-central-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: E0312 18:24:16.669419 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-notification-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669425 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-notification-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: E0312 18:24:16.669450 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="sg-core"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669457 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="sg-core"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669624 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a93b50-2038-4bf8-8c5f-bc77148d55f8" containerName="oc"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669637 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-central-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669649 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="ceilometer-notification-agent"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669662 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="proxy-httpd"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.669673 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" containerName="sg-core"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.671379 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.674311 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.674513 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.690765 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.743383 4926 scope.go:117] "RemoveContainer" containerID="122b7926eeb56b52d198ea8fb2b2bcfa65e9df33208dc1088b72c07833d13975"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768165 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768210 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9v4\" (UniqueName: \"kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768275 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768310 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768333 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768373 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.768448 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871628 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871710 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871738 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871782 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871819 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871894 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.871927 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b9v4\" (UniqueName: \"kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.872583 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.873933 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.878928 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.879157 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.880020 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.880057 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:16 crc kubenswrapper[4926]: I0312 18:24:16.892419 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b9v4\" (UniqueName: \"kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4\") pod \"ceilometer-0\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") " pod="openstack/ceilometer-0"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.098004 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.309665 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-884f7b65b-tpkzl"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.380137 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vz6v\" (UniqueName: \"kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v\") pod \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") "
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.380194 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config\") pod \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") "
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.380247 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle\") pod \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") "
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.380291 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config\") pod \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") "
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.380319 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs\") pod \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\" (UID: \"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea\") "
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.386871 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" (UID: "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.391668 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v" (OuterVolumeSpecName: "kube-api-access-7vz6v") pod "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" (UID: "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea"). InnerVolumeSpecName "kube-api-access-7vz6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.437255 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" (UID: "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.444841 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config" (OuterVolumeSpecName: "config") pod "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" (UID: "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.451748 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" (UID: "5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.482190 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vz6v\" (UniqueName: \"kubernetes.io/projected/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-kube-api-access-7vz6v\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.482224 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.482234 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.482244 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-config\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.482252 4926 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.588130 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.616415 4926 generic.go:334] "Generic (PLEG): container finished" podID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerID="e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84" exitCode=0
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.616491 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerDied","Data":"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"}
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.616525 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-884f7b65b-tpkzl" event={"ID":"5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea","Type":"ContainerDied","Data":"4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3"}
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.616547 4926 scope.go:117] "RemoveContainer" containerID="d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.616647 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-884f7b65b-tpkzl"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.631311 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerStarted","Data":"7996e6fe65ac453b2bf7a0036550739c5607aa7bde6261bcc4407d93c97a4ca7"}
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.655501 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"]
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.657304 4926 scope.go:117] "RemoveContainer" containerID="e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.664503 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-884f7b65b-tpkzl"]
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.677814 4926 scope.go:117] "RemoveContainer" containerID="d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"
Mar 12 18:24:17 crc kubenswrapper[4926]: E0312 18:24:17.678184 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529\": container with ID starting with d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529 not found: ID does not exist" containerID="d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.678237 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529"} err="failed to get container status \"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529\": rpc error: code = NotFound desc = could not find container \"d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529\": container with ID starting with d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529 not found: ID does not exist"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.678268 4926 scope.go:117] "RemoveContainer" containerID="e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"
Mar 12 18:24:17 crc kubenswrapper[4926]: E0312 18:24:17.678602 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84\": container with ID starting with e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84 not found: ID does not exist" containerID="e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"
Mar 12 18:24:17 crc kubenswrapper[4926]: I0312 18:24:17.678633 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84"} err="failed to get container status \"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84\": rpc error: code = NotFound desc = could not find container \"e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84\": container with ID starting with e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84 not found: ID does not exist"
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.421231 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_cinder-api-0_78cc2e75-39e6-4148-87a5-022cc3690da8/cinder-api-log/0.log" to get inode usage: stat /var/log/pods/openstack_cinder-api-0_78cc2e75-39e6-4148-87a5-022cc3690da8/cinder-api-log/0.log: no such file or directory
Mar 12 18:24:18 crc kubenswrapper[4926]: W0312 18:24:18.421420 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a13f5e0_72f6_4c47_a5ea_349c2d618d8c.slice/crio-5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc.scope WatchSource:0}: Error finding container 5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc: Status 404 returned error can't find the container with id 5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.432486 4926 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1f2a42_878e_46c0_bd66_4927a4689299.slice/crio-28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc: Error finding container 28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc: Status 404 returned error can't find the container with id 28a435d769e66430d190ede24f5f603701f099bd6688244c585cc8bf6e7750fc
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.432517 4926 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_cinder-api-0_78cc2e75-39e6-4148-87a5-022cc3690da8/cinder-api/0.log" to get inode usage: stat /var/log/pods/openstack_cinder-api-0_78cc2e75-39e6-4148-87a5-022cc3690da8/cinder-api/0.log: no such file or directory
Mar 12 18:24:18 crc kubenswrapper[4926]: W0312 18:24:18.434236 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a13f5e0_72f6_4c47_a5ea_349c2d618d8c.slice/crio-433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74.scope WatchSource:0}: Error finding container 433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74: Status 404 returned error can't find the container with id 433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74
Mar 12 18:24:18 crc kubenswrapper[4926]: W0312 18:24:18.434930 4926 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64c41ad_a2b0_46ae_8d4c_fa0b01b8649d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64c41ad_a2b0_46ae_8d4c_fa0b01b8649d.slice: no such file or directory
Mar 12 18:24:18 crc kubenswrapper[4926]: W0312 18:24:18.478890 4926 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a93b50_2038_4bf8_8c5f_bc77148d55f8.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a93b50_2038_4bf8_8c5f_bc77148d55f8.slice: no such file or directory
Mar 12 18:24:18 crc kubenswrapper[4926]: W0312 18:24:18.499830 4926 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod785370ab_aa78_4362_adf4_fedc8a0aedf9.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod785370ab_aa78_4362_adf4_fedc8a0aedf9.slice: no such file or directory
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.529006 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" path="/var/lib/kubelet/pods/5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea/volumes"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.530853 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785370ab-aa78-4362-adf4-fedc8a0aedf9" path="/var/lib/kubelet/pods/785370ab-aa78-4362-adf4-fedc8a0aedf9/volumes"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.598326 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-55g54"]
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.601963 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-httpd"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.601998 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-httpd"
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.602036 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-api"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.602062 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-api"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.602274 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-api"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.602287 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8f91a-7f34-4957-b3e4-e5d28b2f43ea" containerName="neutron-httpd"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.609059 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.619985 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-55g54"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.656045 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerStarted","Data":"70089f60f5708dc42a17c61910ff76a140c92fff334f5b497df2b30ef8db681d"}
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.678201 4926 generic.go:334] "Generic (PLEG): container finished" podID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerID="a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62" exitCode=137
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.678307 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerDied","Data":"a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62"}
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.704089 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-t4krn"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.710249 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.713111 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.713209 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm9pf\" (UniqueName: \"kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.734706 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-382f-account-create-update-7sn9z"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.736975 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.739244 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.748621 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t4krn"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.762574 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-382f-account-create-update-7sn9z"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816638 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816699 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfpj\" (UniqueName: \"kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816759 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjlt\" (UniqueName: \"kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816792 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm9pf\" (UniqueName: \"kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816825 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.816862 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.817824 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.845683 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm9pf\" (UniqueName: \"kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf\") pod \"nova-api-db-create-55g54\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.898511 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cctxg"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.900051 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919293 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfpj\" (UniqueName: \"kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919413 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjlt\" (UniqueName: \"kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919494 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919577 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.919665 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrpx\" (UniqueName: \"kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.920578 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.920855 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.923618 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f45b-account-create-update-rt9j9"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.925105 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.926739 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.941819 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cctxg"]
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.942150 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfpj\" (UniqueName: \"kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj\") pod \"nova-api-382f-account-create-update-7sn9z\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.942629 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-382f-account-create-update-7sn9z"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.944384 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjlt\" (UniqueName: \"kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt\") pod \"nova-cell0-db-create-t4krn\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:18 crc kubenswrapper[4926]: I0312 18:24:18.953311 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f45b-account-create-update-rt9j9"]
Mar 12 18:24:18 crc kubenswrapper[4926]: E0312 18:24:18.988133 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2beed02e_2edf_4b52_8ea6_ae2dae7502d8.slice/crio-94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda936793_13b1_4815_a1ec_4d5d609ca5e3.slice/crio-7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7d07aa_8c5e_49f3_8d85_4c5e9569c572.slice/crio-2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7d07aa_8c5e_49f3_8d85_4c5e9569c572.slice/crio-conmon-a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7d07aa_8c5e_49f3_8d85_4c5e9569c572.slice/crio-conmon-2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda936793_13b1_4815_a1ec_4d5d609ca5e3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08abfe56_0e5c_4634_9a1a_488e2bbb587d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda936793_13b1_4815_a1ec_4d5d609ca5e3.slice/crio-01f48ef206a2d7f242f74c69621856e49171cd0d33097cfbec64d9c1935505bd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2beed02e_2edf_4b52_8ea6_ae2dae7502d8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a13f5e0_72f6_4c47_a5ea_349c2d618d8c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc7d07aa_8c5e_49f3_8d85_4c5e9569c572.slice/crio-a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-conmon-e8a41592cfee0a29605ce4445cc98b991801ec37b06cc31b26495c7cc367ec84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-4598f84bb743a934c385d17beee4641ea28d3b613e4146bda41f42d7c8f987e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8f91a_7f34_4957_b3e4_e5d28b2f43ea.slice/crio-conmon-d0514254b366a8c0de092253a4eb8c31e2072ea4a0ebc8efb64f738ddebe6529.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda936793_13b1_4815_a1ec_4d5d609ca5e3.slice/crio-conmon-7d7476ed20c94347bce99f3f0ec729c3292d3ed1f2ba8fd6cc877698a95ad5b8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2beed02e_2edf_4b52_8ea6_ae2dae7502d8.slice/crio-conmon-94b08a85a6e907ae8377090a58dcd3e4401da2dbe17a5ae03f9e88d0b3d26f8e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a13f5e0_72f6_4c47_a5ea_349c2d618d8c.slice/crio-conmon-5a5c133b5ffc1b7921863c9e06663dd92d6d307bfb1967e1d619b5cabd0d99dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08abfe56_0e5c_4634_9a1a_488e2bbb587d.slice/crio-conmon-3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08abfe56_0e5c_4634_9a1a_488e2bbb587d.slice/crio-07e4278fe40ae8695fb94d6a2d5bec78c86b69a3255ccbd19aa22a7865e54876\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2beed02e_2edf_4b52_8ea6_ae2dae7502d8.slice/crio-7b5dc2f13fd79b24bbd53ad5c8577f6fcd9167f1b3fea4e0a8139bb248f187ac\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a13f5e0_72f6_4c47_a5ea_349c2d618d8c.slice/crio-conmon-433bb82141391ed92a9e550d2edd35bdc32cd216136772372e6f96c9b643ee74.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08abfe56_0e5c_4634_9a1a_488e2bbb587d.slice/crio-3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1f2a42_878e_46c0_bd66_4927a4689299.slice\": RecentStats: unable to find data in memory cache]"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.015233 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-55g54"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.023817 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.023909 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrpx\" (UniqueName: \"kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.023974 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rll4\" (UniqueName: \"kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.024018 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.024729 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.026687 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-89554fb64-s9c6q"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.062818 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrpx\" (UniqueName: \"kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx\") pod \"nova-cell1-db-create-cctxg\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " pod="openstack/nova-cell1-db-create-cctxg"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.084366 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4krn"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.112425 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0630-account-create-update-knx9n"]
Mar 12 18:24:19 crc kubenswrapper[4926]: E0312 18:24:19.112993 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon-log"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.113004 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon-log"
Mar 12 18:24:19 crc kubenswrapper[4926]: E0312 18:24:19.113029 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.113035 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.113195 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.113205 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" containerName="horizon-log"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.113718 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0630-account-create-update-knx9n"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.114907 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0630-account-create-update-knx9n"]
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.119546 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125640 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") "
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125684 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") "
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125705 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") "
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125864 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rll4\" (UniqueName: \"kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9"
Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125891 4926 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.125961 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qwj\" (UniqueName: \"kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.126002 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.126687 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.127710 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs" (OuterVolumeSpecName: "logs") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.135811 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.148927 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rll4\" (UniqueName: \"kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4\") pod \"nova-cell0-f45b-account-create-update-rt9j9\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.195200 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts" (OuterVolumeSpecName: "scripts") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227136 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p78br\" (UniqueName: \"kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227185 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227224 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227260 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle\") pod \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\" (UID: \"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572\") " Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227483 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227549 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qwj\" (UniqueName: \"kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227642 4926 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227659 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.227669 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.228275 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc 
kubenswrapper[4926]: I0312 18:24:19.233359 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br" (OuterVolumeSpecName: "kube-api-access-p78br") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "kube-api-access-p78br". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.243578 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qwj\" (UniqueName: \"kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj\") pod \"nova-cell1-0630-account-create-update-knx9n\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.252915 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data" (OuterVolumeSpecName: "config-data") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.294642 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.317539 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" (UID: "dc7d07aa-8c5e-49f3-8d85-4c5e9569c572"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.328949 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cctxg" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.329004 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p78br\" (UniqueName: \"kubernetes.io/projected/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-kube-api-access-p78br\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.329042 4926 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.329055 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.329068 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.341639 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.435288 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.462745 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-382f-account-create-update-7sn9z"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.588708 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-55g54"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.660581 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t4krn"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.757810 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89554fb64-s9c6q" event={"ID":"dc7d07aa-8c5e-49f3-8d85-4c5e9569c572","Type":"ContainerDied","Data":"47a3a3a3dc751bc7ac0243d80b1ecc826d87bf09a948cad8eeaa84ccc31fd39c"} Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.757856 4926 scope.go:117] "RemoveContainer" containerID="2388379aefe6f54987e9387c0ab60e55776d5a81c5e2c5b0f21608b48b6e8fa5" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.757996 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-89554fb64-s9c6q" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.776735 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4krn" event={"ID":"15db9185-0441-431d-98db-5701f9b244be","Type":"ContainerStarted","Data":"6c9449dc088a16c50ac498f087e95dd4074e327717458b2a2dca40a15417614a"} Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.793601 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerStarted","Data":"309dce8d3efcb58229ff7be17feec841c39fb231db3ee8c9eaa9375400b8f496"} Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.800769 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cctxg"] Mar 12 18:24:19 crc kubenswrapper[4926]: W0312 18:24:19.822808 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc4b3e47_ab9e_46c0_be37_9d2e94721be6.slice/crio-e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0 WatchSource:0}: Error finding container e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0: Status 404 returned error can't find the container with id e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0 Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.822914 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-382f-account-create-update-7sn9z" event={"ID":"fe0c0d35-e463-4b5c-a15f-cdfc808c498c","Type":"ContainerStarted","Data":"b1d04c89d3d9fb6caf62bdb2b456380c91864ceab753f46a51bad242a32a7521"} Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.832537 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-55g54" event={"ID":"4cb8f27f-336b-4acc-8165-0c53c1643084","Type":"ContainerStarted","Data":"c6f0363211c7d4e74e3ccbba5ec4240749e8ebe5592c2510b2fa60c1f08ec3c9"} Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.848552 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-382f-account-create-update-7sn9z" podStartSLOduration=1.848534211 podStartE2EDuration="1.848534211s" podCreationTimestamp="2026-03-12 18:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:19.836514084 +0000 UTC m=+1300.205140407" watchObservedRunningTime="2026-03-12 18:24:19.848534211 +0000 UTC m=+1300.217160544" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.853095 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-55g54" podStartSLOduration=1.853076104 podStartE2EDuration="1.853076104s" podCreationTimestamp="2026-03-12 18:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:19.849302775 +0000 UTC m=+1300.217929118" watchObservedRunningTime="2026-03-12 18:24:19.853076104 +0000 UTC m=+1300.221702427" Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.918524 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f45b-account-create-update-rt9j9"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.960248 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-89554fb64-s9c6q"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 
18:24:19.975415 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-89554fb64-s9c6q"] Mar 12 18:24:19 crc kubenswrapper[4926]: I0312 18:24:19.993653 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0630-account-create-update-knx9n"] Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.077657 4926 scope.go:117] "RemoveContainer" containerID="a41e698428edf8a4dccc211b474a012677761bd1a52d186e37df8c3d1445ee62" Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.507333 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7d07aa-8c5e-49f3-8d85-4c5e9569c572" path="/var/lib/kubelet/pods/dc7d07aa-8c5e-49f3-8d85-4c5e9569c572/volumes" Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.852242 4926 generic.go:334] "Generic (PLEG): container finished" podID="29a2618b-1e0a-4246-b63c-582d2fe2f847" containerID="b0665b4cdb9e47cd07d74cad2d92a7eeb8aadf990846b2f50cc7f0a3a493d8dd" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.852747 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0630-account-create-update-knx9n" event={"ID":"29a2618b-1e0a-4246-b63c-582d2fe2f847","Type":"ContainerDied","Data":"b0665b4cdb9e47cd07d74cad2d92a7eeb8aadf990846b2f50cc7f0a3a493d8dd"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.852794 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0630-account-create-update-knx9n" event={"ID":"29a2618b-1e0a-4246-b63c-582d2fe2f847","Type":"ContainerStarted","Data":"ff34501c9af340574663fc5763d61f95a310be34a280df0b21b236386141ee47"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.855520 4926 generic.go:334] "Generic (PLEG): container finished" podID="dc4b3e47-ab9e-46c0-be37-9d2e94721be6" containerID="8476342456babac729c7086b287eb9b60e559eb560bedd1d78d42784e15f0ff2" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.855564 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cctxg" event={"ID":"dc4b3e47-ab9e-46c0-be37-9d2e94721be6","Type":"ContainerDied","Data":"8476342456babac729c7086b287eb9b60e559eb560bedd1d78d42784e15f0ff2"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.855584 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cctxg" event={"ID":"dc4b3e47-ab9e-46c0-be37-9d2e94721be6","Type":"ContainerStarted","Data":"e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.857314 4926 generic.go:334] "Generic (PLEG): container finished" podID="15db9185-0441-431d-98db-5701f9b244be" containerID="a9bf2ae52889d181443948b724481f92343c2befd9093bc8cc132900dface592" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.857349 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4krn" event={"ID":"15db9185-0441-431d-98db-5701f9b244be","Type":"ContainerDied","Data":"a9bf2ae52889d181443948b724481f92343c2befd9093bc8cc132900dface592"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.859127 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerStarted","Data":"ef9ed90f72e8dfcc270f36331c95e42d9aa6057e9d8ae052ebb5501d32f7648b"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.864452 4926 generic.go:334] "Generic (PLEG): container finished" podID="fe0c0d35-e463-4b5c-a15f-cdfc808c498c" 
containerID="976b5bff6bf1beb72cdab8caedbb157ee653f67334ae07735bc5919349b4ae10" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.864497 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-382f-account-create-update-7sn9z" event={"ID":"fe0c0d35-e463-4b5c-a15f-cdfc808c498c","Type":"ContainerDied","Data":"976b5bff6bf1beb72cdab8caedbb157ee653f67334ae07735bc5919349b4ae10"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.866245 4926 generic.go:334] "Generic (PLEG): container finished" podID="b33ff126-1b0c-47f0-a4c1-18e1297fa81d" containerID="f80b93ed7cbf010c51c1bbf4a01b83fdc0dc54adf44c99408ef3f7476eea541b" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.866305 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" event={"ID":"b33ff126-1b0c-47f0-a4c1-18e1297fa81d","Type":"ContainerDied","Data":"f80b93ed7cbf010c51c1bbf4a01b83fdc0dc54adf44c99408ef3f7476eea541b"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.866319 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" event={"ID":"b33ff126-1b0c-47f0-a4c1-18e1297fa81d","Type":"ContainerStarted","Data":"198fc93efd6dbc2346260a021b2acdc8c5be356cd2873df9dfe61cc13f18bd00"} Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.875114 4926 generic.go:334] "Generic (PLEG): container finished" podID="4cb8f27f-336b-4acc-8165-0c53c1643084" containerID="6afc876661934d69ed6bb90885f853f8b5249b4339a13cab5b59071dc578e401" exitCode=0 Mar 12 18:24:20 crc kubenswrapper[4926]: I0312 18:24:20.875166 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-55g54" event={"ID":"4cb8f27f-336b-4acc-8165-0c53c1643084","Type":"ContainerDied","Data":"6afc876661934d69ed6bb90885f853f8b5249b4339a13cab5b59071dc578e401"} Mar 12 18:24:21 crc kubenswrapper[4926]: I0312 18:24:21.970882 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:24:21 crc kubenswrapper[4926]: I0312 18:24:21.971936 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fdd856968-nmxxn" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.089247 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc994c476-fv9c9"] Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.089778 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc994c476-fv9c9" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-log" containerID="cri-o://cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a" gracePeriod=30 Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.090713 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fc994c476-fv9c9" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-api" containerID="cri-o://e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73" gracePeriod=30 Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.316283 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-cctxg" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.438080 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxrpx\" (UniqueName: \"kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx\") pod \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.438304 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts\") pod \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\" (UID: \"dc4b3e47-ab9e-46c0-be37-9d2e94721be6\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.439838 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc4b3e47-ab9e-46c0-be37-9d2e94721be6" (UID: "dc4b3e47-ab9e-46c0-be37-9d2e94721be6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.444120 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx" (OuterVolumeSpecName: "kube-api-access-mxrpx") pod "dc4b3e47-ab9e-46c0-be37-9d2e94721be6" (UID: "dc4b3e47-ab9e-46c0-be37-9d2e94721be6"). InnerVolumeSpecName "kube-api-access-mxrpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.542585 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxrpx\" (UniqueName: \"kubernetes.io/projected/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-kube-api-access-mxrpx\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.542621 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc4b3e47-ab9e-46c0-be37-9d2e94721be6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.553176 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-55g54" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.566734 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4krn" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.585250 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.604290 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-382f-account-create-update-7sn9z" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.620059 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646538 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm9pf\" (UniqueName: \"kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf\") pod \"4cb8f27f-336b-4acc-8165-0c53c1643084\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646621 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rll4\" (UniqueName: \"kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4\") pod \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646665 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjlt\" (UniqueName: \"kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt\") pod \"15db9185-0441-431d-98db-5701f9b244be\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646727 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts\") pod \"15db9185-0441-431d-98db-5701f9b244be\" (UID: \"15db9185-0441-431d-98db-5701f9b244be\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646757 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjfpj\" (UniqueName: \"kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj\") pod \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646787 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts\") pod \"4cb8f27f-336b-4acc-8165-0c53c1643084\" (UID: \"4cb8f27f-336b-4acc-8165-0c53c1643084\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646835 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts\") pod \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\" (UID: \"fe0c0d35-e463-4b5c-a15f-cdfc808c498c\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.646867 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts\") pod \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\" (UID: \"b33ff126-1b0c-47f0-a4c1-18e1297fa81d\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.647670 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b33ff126-1b0c-47f0-a4c1-18e1297fa81d" (UID: "b33ff126-1b0c-47f0-a4c1-18e1297fa81d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.648425 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15db9185-0441-431d-98db-5701f9b244be" (UID: "15db9185-0441-431d-98db-5701f9b244be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.650929 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf" (OuterVolumeSpecName: "kube-api-access-jm9pf") pod "4cb8f27f-336b-4acc-8165-0c53c1643084" (UID: "4cb8f27f-336b-4acc-8165-0c53c1643084"). InnerVolumeSpecName "kube-api-access-jm9pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.651692 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4" (OuterVolumeSpecName: "kube-api-access-5rll4") pod "b33ff126-1b0c-47f0-a4c1-18e1297fa81d" (UID: "b33ff126-1b0c-47f0-a4c1-18e1297fa81d"). InnerVolumeSpecName "kube-api-access-5rll4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.654587 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt" (OuterVolumeSpecName: "kube-api-access-gbjlt") pod "15db9185-0441-431d-98db-5701f9b244be" (UID: "15db9185-0441-431d-98db-5701f9b244be"). InnerVolumeSpecName "kube-api-access-gbjlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.655363 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cb8f27f-336b-4acc-8165-0c53c1643084" (UID: "4cb8f27f-336b-4acc-8165-0c53c1643084"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.655722 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe0c0d35-e463-4b5c-a15f-cdfc808c498c" (UID: "fe0c0d35-e463-4b5c-a15f-cdfc808c498c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.665620 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj" (OuterVolumeSpecName: "kube-api-access-hjfpj") pod "fe0c0d35-e463-4b5c-a15f-cdfc808c498c" (UID: "fe0c0d35-e463-4b5c-a15f-cdfc808c498c"). InnerVolumeSpecName "kube-api-access-hjfpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.748686 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts\") pod \"29a2618b-1e0a-4246-b63c-582d2fe2f847\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.748917 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8qwj\" (UniqueName: \"kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj\") pod \"29a2618b-1e0a-4246-b63c-582d2fe2f847\" (UID: \"29a2618b-1e0a-4246-b63c-582d2fe2f847\") " Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749391 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm9pf\" (UniqueName: \"kubernetes.io/projected/4cb8f27f-336b-4acc-8165-0c53c1643084-kube-api-access-jm9pf\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749416 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rll4\" (UniqueName: \"kubernetes.io/projected/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-kube-api-access-5rll4\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749430 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjlt\" (UniqueName: \"kubernetes.io/projected/15db9185-0441-431d-98db-5701f9b244be-kube-api-access-gbjlt\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749461 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15db9185-0441-431d-98db-5701f9b244be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749472 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjfpj\" (UniqueName: \"kubernetes.io/projected/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-kube-api-access-hjfpj\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749485 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cb8f27f-336b-4acc-8165-0c53c1643084-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749497 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe0c0d35-e463-4b5c-a15f-cdfc808c498c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.749510 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b33ff126-1b0c-47f0-a4c1-18e1297fa81d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.750798 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29a2618b-1e0a-4246-b63c-582d2fe2f847" (UID: "29a2618b-1e0a-4246-b63c-582d2fe2f847"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.756635 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj" (OuterVolumeSpecName: "kube-api-access-p8qwj") pod "29a2618b-1e0a-4246-b63c-582d2fe2f847" (UID: "29a2618b-1e0a-4246-b63c-582d2fe2f847"). InnerVolumeSpecName "kube-api-access-p8qwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.851112 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8qwj\" (UniqueName: \"kubernetes.io/projected/29a2618b-1e0a-4246-b63c-582d2fe2f847-kube-api-access-p8qwj\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.851341 4926 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29a2618b-1e0a-4246-b63c-582d2fe2f847-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.907029 4926 generic.go:334] "Generic (PLEG): container finished" podID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerID="cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a" exitCode=143 Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.907106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerDied","Data":"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.910141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerStarted","Data":"d0d68ef12f3c48e2e671f3c6b6e0ef4adbaa0a1069ddce15c7818c9465fd1365"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.910376 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.914559 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-382f-account-create-update-7sn9z" event={"ID":"fe0c0d35-e463-4b5c-a15f-cdfc808c498c","Type":"ContainerDied","Data":"b1d04c89d3d9fb6caf62bdb2b456380c91864ceab753f46a51bad242a32a7521"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.914604 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d04c89d3d9fb6caf62bdb2b456380c91864ceab753f46a51bad242a32a7521" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.914682 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-382f-account-create-update-7sn9z" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.920857 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" event={"ID":"b33ff126-1b0c-47f0-a4c1-18e1297fa81d","Type":"ContainerDied","Data":"198fc93efd6dbc2346260a021b2acdc8c5be356cd2873df9dfe61cc13f18bd00"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.920895 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="198fc93efd6dbc2346260a021b2acdc8c5be356cd2873df9dfe61cc13f18bd00" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.920878 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-f45b-account-create-update-rt9j9" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.922687 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-55g54" event={"ID":"4cb8f27f-336b-4acc-8165-0c53c1643084","Type":"ContainerDied","Data":"c6f0363211c7d4e74e3ccbba5ec4240749e8ebe5592c2510b2fa60c1f08ec3c9"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.922710 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-55g54" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.922724 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f0363211c7d4e74e3ccbba5ec4240749e8ebe5592c2510b2fa60c1f08ec3c9" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.927044 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0630-account-create-update-knx9n" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.927050 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0630-account-create-update-knx9n" event={"ID":"29a2618b-1e0a-4246-b63c-582d2fe2f847","Type":"ContainerDied","Data":"ff34501c9af340574663fc5763d61f95a310be34a280df0b21b236386141ee47"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.927766 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff34501c9af340574663fc5763d61f95a310be34a280df0b21b236386141ee47" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.936822 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.501118291 podStartE2EDuration="6.936805548s" podCreationTimestamp="2026-03-12 18:24:16 +0000 UTC" firstStartedPulling="2026-03-12 18:24:17.588843105 +0000 UTC m=+1297.957469438" lastFinishedPulling="2026-03-12 18:24:22.024530362 +0000 UTC m=+1302.393156695" observedRunningTime="2026-03-12 18:24:22.932904356 +0000 UTC m=+1303.301530689" watchObservedRunningTime="2026-03-12 18:24:22.936805548 +0000 UTC m=+1303.305431871" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.938229 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cctxg" event={"ID":"dc4b3e47-ab9e-46c0-be37-9d2e94721be6","Type":"ContainerDied","Data":"e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.938269 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0b512c7b0ec080f65f1af63e533a0195e053d720c63ac72bbcdd0918ebd9be0" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.938285 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cctxg" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.943275 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t4krn" Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.943596 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4krn" event={"ID":"15db9185-0441-431d-98db-5701f9b244be","Type":"ContainerDied","Data":"6c9449dc088a16c50ac498f087e95dd4074e327717458b2a2dca40a15417614a"} Mar 12 18:24:22 crc kubenswrapper[4926]: I0312 18:24:22.943720 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c9449dc088a16c50ac498f087e95dd4074e327717458b2a2dca40a15417614a" Mar 12 18:24:23 crc kubenswrapper[4926]: I0312 18:24:23.935958 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 18:24:23 crc kubenswrapper[4926]: I0312 18:24:23.940724 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-httpd" containerID="cri-o://1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178" gracePeriod=30 Mar 12 18:24:23 crc kubenswrapper[4926]: I0312 18:24:23.940682 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-log" containerID="cri-o://9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c" gracePeriod=30 Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159157 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cbqv8"] Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159654 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb8f27f-336b-4acc-8165-0c53c1643084" containerName="mariadb-database-create" Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159680 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb8f27f-336b-4acc-8165-0c53c1643084" containerName="mariadb-database-create" Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159698 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15db9185-0441-431d-98db-5701f9b244be" containerName="mariadb-database-create" Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159707 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="15db9185-0441-431d-98db-5701f9b244be" containerName="mariadb-database-create" Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159720 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b33ff126-1b0c-47f0-a4c1-18e1297fa81d" containerName="mariadb-account-create-update" Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159728 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b33ff126-1b0c-47f0-a4c1-18e1297fa81d" containerName="mariadb-account-create-update" Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159739 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0c0d35-e463-4b5c-a15f-cdfc808c498c" containerName="mariadb-account-create-update" Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159747 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0c0d35-e463-4b5c-a15f-cdfc808c498c" containerName="mariadb-account-create-update" Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159774 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2618b-1e0a-4246-b63c-582d2fe2f847" containerName="mariadb-account-create-update" Mar 12 
18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159782 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2618b-1e0a-4246-b63c-582d2fe2f847" containerName="mariadb-account-create-update"
Mar 12 18:24:24 crc kubenswrapper[4926]: E0312 18:24:24.159810 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4b3e47-ab9e-46c0-be37-9d2e94721be6" containerName="mariadb-database-create"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.159818 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4b3e47-ab9e-46c0-be37-9d2e94721be6" containerName="mariadb-database-create"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160030 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb8f27f-336b-4acc-8165-0c53c1643084" containerName="mariadb-database-create"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160046 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b33ff126-1b0c-47f0-a4c1-18e1297fa81d" containerName="mariadb-account-create-update"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160060 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2618b-1e0a-4246-b63c-582d2fe2f847" containerName="mariadb-account-create-update"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160077 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="15db9185-0441-431d-98db-5701f9b244be" containerName="mariadb-database-create"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160099 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0c0d35-e463-4b5c-a15f-cdfc808c498c" containerName="mariadb-account-create-update"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.160114 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4b3e47-ab9e-46c0-be37-9d2e94721be6" containerName="mariadb-database-create"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.162506 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.171269 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.171710 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.171993 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-89pqc"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.186049 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cbqv8"]
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.275824 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.276497 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.276598 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbq59\" (UniqueName: \"kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.276688 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.378347 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbq59\" (UniqueName: \"kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.378615 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.378717 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.378909 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.384430 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.384992 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.390490 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.415109 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbq59\" (UniqueName: \"kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59\") pod \"nova-cell0-conductor-db-sync-cbqv8\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.481060 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cbqv8"
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.804104 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cbqv8"]
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.983678 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" event={"ID":"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57","Type":"ContainerStarted","Data":"f726c742045749f4782471f8e9d8e4c6886b24fac2346f0142597eae7e927832"}
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.985554 4926 generic.go:334] "Generic (PLEG): container finished" podID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerID="9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c" exitCode=143
Mar 12 18:24:24 crc kubenswrapper[4926]: I0312 18:24:24.985590 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerDied","Data":"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"}
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.705464 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc994c476-fv9c9"
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.725652 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726054 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726094 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2nd\" (UniqueName: \"kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726192 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726236 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726322 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.726343 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle\") pod \"450e1ecf-5ae7-48b5-b567-e530e254f673\" (UID: \"450e1ecf-5ae7-48b5-b567-e530e254f673\") "
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.728124 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs" (OuterVolumeSpecName: "logs") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.737130 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts" (OuterVolumeSpecName: "scripts") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.745638 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd" (OuterVolumeSpecName: "kube-api-access-vh2nd") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "kube-api-access-vh2nd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.805957 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data" (OuterVolumeSpecName: "config-data") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.829402 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/450e1ecf-5ae7-48b5-b567-e530e254f673-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.829449 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.829462 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.829475 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2nd\" (UniqueName: \"kubernetes.io/projected/450e1ecf-5ae7-48b5-b567-e530e254f673-kube-api-access-vh2nd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.839507 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.856581 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.870081 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "450e1ecf-5ae7-48b5-b567-e530e254f673" (UID: "450e1ecf-5ae7-48b5-b567-e530e254f673"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.930933 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.930998 4926 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:25 crc kubenswrapper[4926]: I0312 18:24:25.931011 4926 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/450e1ecf-5ae7-48b5-b567-e530e254f673-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.007999 4926 generic.go:334] "Generic (PLEG): container finished" podID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerID="e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73" exitCode=0
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.008079 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerDied","Data":"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"}
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.008115 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fc994c476-fv9c9" event={"ID":"450e1ecf-5ae7-48b5-b567-e530e254f673","Type":"ContainerDied","Data":"1f1ee7c53345b2a1183a515645bed6bfd508f8efad3f3b4a2588d4c7787c2e09"}
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.008387 4926 scope.go:117] "RemoveContainer" containerID="e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.008608 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fc994c476-fv9c9"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.051702 4926 scope.go:117] "RemoveContainer" containerID="cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.053464 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fc994c476-fv9c9"]
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.063409 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6fc994c476-fv9c9"]
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.073818 4926 scope.go:117] "RemoveContainer" containerID="e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"
Mar 12 18:24:26 crc kubenswrapper[4926]: E0312 18:24:26.074189 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73\": container with ID starting with e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73 not found: ID does not exist" containerID="e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.074231 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73"} err="failed to get container status \"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73\": rpc error: code = NotFound desc = could not find container \"e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73\": container with ID starting with e87064db3c4dab601e171b8ef221f01b14584cedfda748a9cbf2db9f357b6a73 not found: ID does not exist"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.074256 4926 scope.go:117] "RemoveContainer" containerID="cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"
Mar 12 18:24:26 crc kubenswrapper[4926]: E0312 18:24:26.077991 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a\": container with ID starting with cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a not found: ID does not exist" containerID="cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.078019 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a"} err="failed to get container status \"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a\": rpc error: code = NotFound desc = could not find container \"cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a\": container with ID starting with cafbe824f3deda5f6c1263207977f209e8ce8e8b6bb8c642a6b34e03c9ace92a not found: ID does not exist"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.482960 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.488127 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-httpd" containerID="cri-o://73938b8cf4a571a4406931a492f2e033c549c10b7a2bcdf0ca6b902ede59649d" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.491503 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-log" containerID="cri-o://809f7347552b8f928619d96194c02f26e8cc98b9b07e8efde999fc4f99c7437f" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.513729 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" path="/var/lib/kubelet/pods/450e1ecf-5ae7-48b5-b567-e530e254f673/volumes"
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.560305 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.560570 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-central-agent" containerID="cri-o://70089f60f5708dc42a17c61910ff76a140c92fff334f5b497df2b30ef8db681d" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.560952 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="proxy-httpd" containerID="cri-o://d0d68ef12f3c48e2e671f3c6b6e0ef4adbaa0a1069ddce15c7818c9465fd1365" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.561003 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="sg-core" containerID="cri-o://ef9ed90f72e8dfcc270f36331c95e42d9aa6057e9d8ae052ebb5501d32f7648b" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.561035 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-notification-agent" containerID="cri-o://309dce8d3efcb58229ff7be17feec841c39fb231db3ee8c9eaa9375400b8f496" gracePeriod=30
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.817349 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 18:24:26 crc kubenswrapper[4926]: I0312 18:24:26.817450 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026137 4926 generic.go:334] "Generic (PLEG): container finished" podID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerID="d0d68ef12f3c48e2e671f3c6b6e0ef4adbaa0a1069ddce15c7818c9465fd1365" exitCode=0
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026173 4926 generic.go:334] "Generic (PLEG): container finished" podID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerID="ef9ed90f72e8dfcc270f36331c95e42d9aa6057e9d8ae052ebb5501d32f7648b" exitCode=2
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026181 4926 generic.go:334] "Generic (PLEG): container finished" podID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerID="309dce8d3efcb58229ff7be17feec841c39fb231db3ee8c9eaa9375400b8f496" exitCode=0
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026239 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerDied","Data":"d0d68ef12f3c48e2e671f3c6b6e0ef4adbaa0a1069ddce15c7818c9465fd1365"}
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026292 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerDied","Data":"ef9ed90f72e8dfcc270f36331c95e42d9aa6057e9d8ae052ebb5501d32f7648b"}
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.026314 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerDied","Data":"309dce8d3efcb58229ff7be17feec841c39fb231db3ee8c9eaa9375400b8f496"}
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.028874 4926 generic.go:334] "Generic (PLEG): container finished" podID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerID="809f7347552b8f928619d96194c02f26e8cc98b9b07e8efde999fc4f99c7437f" exitCode=143
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.028922 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerDied","Data":"809f7347552b8f928619d96194c02f26e8cc98b9b07e8efde999fc4f99c7437f"}
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.635066 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.760734 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.760796 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.760891 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v2m5\" (UniqueName: \"kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.760960 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761588 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761622 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761637 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761719 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data\") pod \"27561f0e-1da4-4313-a7df-544fdfc893b1\" (UID: \"27561f0e-1da4-4313-a7df-544fdfc893b1\") "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761866 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.761934 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs" (OuterVolumeSpecName: "logs") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.762484 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.762499 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27561f0e-1da4-4313-a7df-544fdfc893b1-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.767020 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts" (OuterVolumeSpecName: "scripts") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.768577 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.789895 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5" (OuterVolumeSpecName: "kube-api-access-9v2m5") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "kube-api-access-9v2m5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.805667 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.831879 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data" (OuterVolumeSpecName: "config-data") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.842988 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "27561f0e-1da4-4313-a7df-544fdfc893b1" (UID: "27561f0e-1da4-4313-a7df-544fdfc893b1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864209 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864254 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864273 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v2m5\" (UniqueName: \"kubernetes.io/projected/27561f0e-1da4-4313-a7df-544fdfc893b1-kube-api-access-9v2m5\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864289 4926 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864303 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.864315 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27561f0e-1da4-4313-a7df-544fdfc893b1-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.895522 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 12 18:24:27 crc kubenswrapper[4926]: I0312 18:24:27.965564 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.041248 4926 generic.go:334] "Generic (PLEG): container finished" podID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerID="1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178" exitCode=0
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.041319 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerDied","Data":"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"}
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.041361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"27561f0e-1da4-4313-a7df-544fdfc893b1","Type":"ContainerDied","Data":"600cdc83756bdbe2006c6d01ba6698b38db80c94206722c7e9c45bd9b1332b71"}
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.041386 4926 scope.go:117] "RemoveContainer" containerID="1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.041594 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.083745 4926 scope.go:117] "RemoveContainer" containerID="9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.092160 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.111920 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.126339 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.129231 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-api"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.129289 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-api"
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.129524 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.129538 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.129567 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-httpd"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.129761 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-httpd"
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.129787 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.129796 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.132020 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.132050 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="450e1ecf-5ae7-48b5-b567-e530e254f673" containerName="placement-api"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.132072 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-log"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.132102 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" containerName="glance-httpd"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.133205 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.138173 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.139652 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.141229 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.177725 4926 scope.go:117] "RemoveContainer" containerID="1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.179868 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178\": container with ID starting with 1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178 not found: ID does not exist" containerID="1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.179918 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178"} err="failed to get container status \"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178\": rpc error: code = NotFound desc = could not find container \"1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178\": container with ID starting with 1fa3d22484b03763d8133f2d587a25b5e1ffc1e16aa1a1f610d78f1e559cc178 not found: ID does not exist"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.179964 4926 scope.go:117] "RemoveContainer" containerID="9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"
Mar 12 18:24:28 crc kubenswrapper[4926]: E0312 18:24:28.181846 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c\": container with ID starting with 9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c not found: ID does not exist" containerID="9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.181914 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c"} err="failed to get container status \"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c\": rpc error: code = NotFound desc = could not find container \"9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c\": container with ID starting with 9c7305029bac56d73a118acb64bb658e6027d0c4477cedae842b84b08298a73c not found: ID does not exist"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269703 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lxj\" (UniqueName: \"kubernetes.io/projected/0749ce08-7fa5-48fe-9248-ac3a6699ef57-kube-api-access-z5lxj\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269751 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-scripts\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269788 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269852 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-config-data\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269894 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269916 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269938 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.269970 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-logs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.371893 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lxj\" (UniqueName: \"kubernetes.io/projected/0749ce08-7fa5-48fe-9248-ac3a6699ef57-kube-api-access-z5lxj\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.371938 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-scripts\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.371964 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372016 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-config-data\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372053 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372072 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372089 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372116 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-logs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372686 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-logs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372828 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0749ce08-7fa5-48fe-9248-ac3a6699ef57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.372927 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.377972 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.378370 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.378844 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-scripts\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.379350 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0749ce08-7fa5-48fe-9248-ac3a6699ef57-config-data\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.399422 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lxj\" (UniqueName: \"kubernetes.io/projected/0749ce08-7fa5-48fe-9248-ac3a6699ef57-kube-api-access-z5lxj\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.421515 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"0749ce08-7fa5-48fe-9248-ac3a6699ef57\") " pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.500422 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 12 18:24:28 crc kubenswrapper[4926]: I0312 18:24:28.501286 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27561f0e-1da4-4313-a7df-544fdfc893b1" path="/var/lib/kubelet/pods/27561f0e-1da4-4313-a7df-544fdfc893b1/volumes"
Mar 12 18:24:29 crc kubenswrapper[4926]: I0312 18:24:29.087982 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 12 18:24:29 crc kubenswrapper[4926]: I0312 18:24:29.643526 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:53948->10.217.0.154:9292: read: connection reset by peer"
Mar 12 18:24:29 crc kubenswrapper[4926]: I0312 18:24:29.644405 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9292/healthcheck\": read tcp 10.217.0.2:53946->10.217.0.154:9292: read: connection reset by peer"
Mar 12 18:24:30 crc kubenswrapper[4926]: I0312 18:24:30.069215 4926 generic.go:334] "Generic (PLEG): container finished" podID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerID="70089f60f5708dc42a17c61910ff76a140c92fff334f5b497df2b30ef8db681d" exitCode=0
Mar 12 18:24:30 crc kubenswrapper[4926]: I0312 18:24:30.069271 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerDied","Data":"70089f60f5708dc42a17c61910ff76a140c92fff334f5b497df2b30ef8db681d"}
Mar 12 18:24:30 crc kubenswrapper[4926]: I0312 18:24:30.071045 4926 generic.go:334] "Generic (PLEG): container finished" podID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerID="73938b8cf4a571a4406931a492f2e033c549c10b7a2bcdf0ca6b902ede59649d" exitCode=0
Mar 12 18:24:30 crc kubenswrapper[4926]: I0312 18:24:30.071072 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerDied","Data":"73938b8cf4a571a4406931a492f2e033c549c10b7a2bcdf0ca6b902ede59649d"}
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.108824 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0749ce08-7fa5-48fe-9248-ac3a6699ef57","Type":"ContainerStarted","Data":"a4a12cabad0ea532415d393ea0ffd34bbf6b2b5705b28bea1f5b26dd3d7804de"}
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.442028 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.486632 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505106 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505164 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505213 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505237 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505260 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505353 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505382 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505407 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qgwd\" (UniqueName: \"kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505430 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505856 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505883 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505887 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.505983 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506164 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506336 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b9v4\" (UniqueName: \"kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506387 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs\") pod \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\" (UID: \"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506416 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml\") pod \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\" (UID: \"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea\") "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506479 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506910 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs" (OuterVolumeSpecName: "logs") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506976 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.506995 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.507004 4926 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.509860 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd" (OuterVolumeSpecName: "kube-api-access-7qgwd") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "kube-api-access-7qgwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.511261 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.539251 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts" (OuterVolumeSpecName: "scripts") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.539271 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts" (OuterVolumeSpecName: "scripts") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.540329 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4" (OuterVolumeSpecName: "kube-api-access-7b9v4") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "kube-api-access-7b9v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.567264 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.583744 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.594241 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608614 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b9v4\" (UniqueName: \"kubernetes.io/projected/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-kube-api-access-7b9v4\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608644 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608656 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608663 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608671 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608679 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qgwd\" (UniqueName: \"kubernetes.io/projected/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-kube-api-access-7qgwd\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608687 4926 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608712 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.608721 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.614094 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data"
(OuterVolumeSpecName: "config-data") pod "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" (UID: "f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.628062 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.633404 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data" (OuterVolumeSpecName: "config-data") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.673192 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" (UID: "2e50e6cb-b07b-4a13-8b5c-e889e53c49ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.709804 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.710048 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.710060 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:34 crc kubenswrapper[4926]: I0312 18:24:34.710070 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.125178 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e50e6cb-b07b-4a13-8b5c-e889e53c49ea","Type":"ContainerDied","Data":"7996e6fe65ac453b2bf7a0036550739c5607aa7bde6261bcc4407d93c97a4ca7"} Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.125233 4926 scope.go:117] "RemoveContainer" containerID="d0d68ef12f3c48e2e671f3c6b6e0ef4adbaa0a1069ddce15c7818c9465fd1365" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.125285 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.132363 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.132480 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a","Type":"ContainerDied","Data":"ac524b6af20000079258195bab5b0ef6d7a0ae5f18a8830578a4f4e1b53a9c85"} Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.136239 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0749ce08-7fa5-48fe-9248-ac3a6699ef57","Type":"ContainerStarted","Data":"157de7d9a24e5910cf8461e4d788dc27f9a81db1355cf4f0c8d758eeeec9fd24"} Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.138364 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" event={"ID":"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57","Type":"ContainerStarted","Data":"683758cbf77fef46bca458acf7d282b0867f1aa1a065fbc9925548f2b1c73316"} Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.173099 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" podStartSLOduration=1.920938977 podStartE2EDuration="11.173077489s" podCreationTimestamp="2026-03-12 18:24:24 +0000 UTC" firstStartedPulling="2026-03-12 18:24:24.820245218 +0000 UTC m=+1305.188871551" lastFinishedPulling="2026-03-12 18:24:34.07238373 +0000 UTC m=+1314.441010063" observedRunningTime="2026-03-12 18:24:35.157710336 +0000 UTC m=+1315.526336689" watchObservedRunningTime="2026-03-12 18:24:35.173077489 +0000 UTC m=+1315.541703822" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.180579 4926 scope.go:117] "RemoveContainer" containerID="ef9ed90f72e8dfcc270f36331c95e42d9aa6057e9d8ae052ebb5501d32f7648b" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.203399 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.212735 4926 scope.go:117] "RemoveContainer" containerID="309dce8d3efcb58229ff7be17feec841c39fb231db3ee8c9eaa9375400b8f496" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.221890 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.249938 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.255969 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256806 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-log" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256832 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-log" Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256849 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="sg-core" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256858 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="sg-core" Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256882 4926 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256892 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256908 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-central-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256915 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-central-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256934 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-notification-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256942 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-notification-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: E0312 18:24:35.256956 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="proxy-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.256963 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="proxy-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257158 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-log" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257183 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-central-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257202 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="proxy-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257216 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="sg-core" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257229 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" containerName="ceilometer-notification-agent" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.257240 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" containerName="glance-httpd" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.258481 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.263170 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.263243 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.263318 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.264550 4926 scope.go:117] "RemoveContainer" containerID="70089f60f5708dc42a17c61910ff76a140c92fff334f5b497df2b30ef8db681d" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.284556 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.292266 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.294246 4926 scope.go:117] "RemoveContainer" containerID="73938b8cf4a571a4406931a492f2e033c549c10b7a2bcdf0ca6b902ede59649d" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.294702 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.298773 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.313300 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.337531 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.340035 4926 scope.go:117] "RemoveContainer" containerID="809f7347552b8f928619d96194c02f26e8cc98b9b07e8efde999fc4f99c7437f" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428311 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428365 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428416 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428493 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqxnc\" (UniqueName: \"kubernetes.io/projected/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-kube-api-access-bqxnc\") pod \"glance-default-internal-api-0\" (UID: 
\"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428529 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428554 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428582 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428620 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428640 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428661 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428696 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428731 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.428852 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.429003 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7gn\" (UniqueName: \"kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.429045 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530558 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530610 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530652 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7gn\" (UniqueName: \"kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530673 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530702 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530717 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530748 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530783 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqxnc\" (UniqueName: 
\"kubernetes.io/projected/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-kube-api-access-bqxnc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530804 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530821 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530844 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530869 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530886 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530900 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.530924 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.531370 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.532633 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc 
kubenswrapper[4926]: I0312 18:24:35.532664 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.532750 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.532944 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.537109 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.537148 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.537301 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.537954 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.546178 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.548566 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.550043 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.550385 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.553331 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7gn\" (UniqueName: \"kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn\") pod \"ceilometer-0\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.554127 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqxnc\" (UniqueName: \"kubernetes.io/projected/1e75a0d1-4272-43f3-b1b8-3cfe57e0141d-kube-api-access-bqxnc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.579466 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d\") " pod="openstack/glance-default-internal-api-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.642223 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:35 crc kubenswrapper[4926]: I0312 18:24:35.877277 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.072428 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:36 crc kubenswrapper[4926]: W0312 18:24:36.075641 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf48005ff_c364_47aa_8253_9a3124097a10.slice/crio-f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905 WatchSource:0}: Error finding container f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905: Status 404 returned error can't find the container with id f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905 Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.149494 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0749ce08-7fa5-48fe-9248-ac3a6699ef57","Type":"ContainerStarted","Data":"0422041cdf5f4f73ab75457856395019cfba45f8d3b4cb8fec90717ce08223b5"} Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.152407 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerStarted","Data":"f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905"} Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.181361 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.181342816 podStartE2EDuration="8.181342816s" podCreationTimestamp="2026-03-12 18:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:36.169146884 +0000 UTC m=+1316.537773247" watchObservedRunningTime="2026-03-12 18:24:36.181342816 +0000 UTC m=+1316.549969149" Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.366532 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 18:24:36 crc kubenswrapper[4926]: W0312 18:24:36.367695 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e75a0d1_4272_43f3_b1b8_3cfe57e0141d.slice/crio-5ccda701c4d814f87501797d7fb8c8b92ceeefdf90d8b6cef3192f73df6c015a WatchSource:0}: Error finding container 5ccda701c4d814f87501797d7fb8c8b92ceeefdf90d8b6cef3192f73df6c015a: Status 404 returned error can't find the container with id 5ccda701c4d814f87501797d7fb8c8b92ceeefdf90d8b6cef3192f73df6c015a Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.514349 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e50e6cb-b07b-4a13-8b5c-e889e53c49ea" path="/var/lib/kubelet/pods/2e50e6cb-b07b-4a13-8b5c-e889e53c49ea/volumes" Mar 12 18:24:36 crc kubenswrapper[4926]: I0312 18:24:36.515715 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a" path="/var/lib/kubelet/pods/f3945fa8-a1cc-4c4b-91c6-2cb5a6567f8a/volumes" Mar 12 18:24:37 crc kubenswrapper[4926]: I0312 18:24:37.167668 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerStarted","Data":"435ef2132df5554295889ec9ba808434afc58d13077a9a39e7222dd605149d55"} Mar 12 18:24:37 crc kubenswrapper[4926]: I0312 18:24:37.171164 4926 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d","Type":"ContainerStarted","Data":"e57f630ce30cab4c95cd232834974e08eb8649298261ded7bcb7b40c4db45ae8"} Mar 12 18:24:37 crc kubenswrapper[4926]: I0312 18:24:37.171196 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d","Type":"ContainerStarted","Data":"5ccda701c4d814f87501797d7fb8c8b92ceeefdf90d8b6cef3192f73df6c015a"} Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.183118 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerStarted","Data":"0b78f1753e8c4a22ed2f4147f06789b56f6db668b7a92626590701ea109bcd09"} Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.185934 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e75a0d1-4272-43f3-b1b8-3cfe57e0141d","Type":"ContainerStarted","Data":"42e78e74e76a93045e02e79927bae762527f964a9237f343e0e05360197b5f00"} Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.212120 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.212099439 podStartE2EDuration="3.212099439s" podCreationTimestamp="2026-03-12 18:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:38.206879486 +0000 UTC m=+1318.575505819" watchObservedRunningTime="2026-03-12 18:24:38.212099439 +0000 UTC m=+1318.580725782" Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.503728 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.504086 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.549702 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 18:24:38 crc kubenswrapper[4926]: I0312 18:24:38.553690 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 18:24:39 crc kubenswrapper[4926]: I0312 18:24:39.197891 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerStarted","Data":"7655ace271d6f3feff7f2f4eaa16c5ea625e5fe6beab0c4038907b7f5e715987"} Mar 12 18:24:39 crc kubenswrapper[4926]: I0312 18:24:39.198885 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 18:24:39 crc kubenswrapper[4926]: I0312 18:24:39.198964 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 18:24:39 crc kubenswrapper[4926]: I0312 18:24:39.425808 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222479 4926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222565 4926 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-central-agent" containerID="cri-o://435ef2132df5554295889ec9ba808434afc58d13077a9a39e7222dd605149d55" gracePeriod=30 Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222601 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerStarted","Data":"89db61a8e375fc043c0de0eeea971f98c1fb8c205fa68c44c6f73887667808c6"} Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222850 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="sg-core" containerID="cri-o://7655ace271d6f3feff7f2f4eaa16c5ea625e5fe6beab0c4038907b7f5e715987" gracePeriod=30 Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.223429 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222892 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-notification-agent" containerID="cri-o://0b78f1753e8c4a22ed2f4147f06789b56f6db668b7a92626590701ea109bcd09" gracePeriod=30 Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.222879 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="proxy-httpd" containerID="cri-o://89db61a8e375fc043c0de0eeea971f98c1fb8c205fa68c44c6f73887667808c6" gracePeriod=30 Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.253216 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9285094919999999 podStartE2EDuration="6.253198906s" podCreationTimestamp="2026-03-12 18:24:35 +0000 UTC" firstStartedPulling="2026-03-12 18:24:36.078095456 +0000 UTC m=+1316.446721789" lastFinishedPulling="2026-03-12 18:24:40.40278484 +0000 UTC m=+1320.771411203" observedRunningTime="2026-03-12 18:24:41.244678318 +0000 UTC m=+1321.613304661" watchObservedRunningTime="2026-03-12 18:24:41.253198906 +0000 UTC m=+1321.621825239" Mar 12 18:24:41 crc kubenswrapper[4926]: I0312 18:24:41.307848 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.094060 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.236650 4926 generic.go:334] "Generic (PLEG): container finished" podID="f48005ff-c364-47aa-8253-9a3124097a10" containerID="89db61a8e375fc043c0de0eeea971f98c1fb8c205fa68c44c6f73887667808c6" exitCode=0 Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.236684 4926 generic.go:334] "Generic (PLEG): container finished" podID="f48005ff-c364-47aa-8253-9a3124097a10" containerID="7655ace271d6f3feff7f2f4eaa16c5ea625e5fe6beab0c4038907b7f5e715987" exitCode=2 Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.236692 4926 generic.go:334] "Generic (PLEG): container finished" podID="f48005ff-c364-47aa-8253-9a3124097a10" containerID="0b78f1753e8c4a22ed2f4147f06789b56f6db668b7a92626590701ea109bcd09" exitCode=0 Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.237322 4926 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerDied","Data":"89db61a8e375fc043c0de0eeea971f98c1fb8c205fa68c44c6f73887667808c6"} Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.237378 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerDied","Data":"7655ace271d6f3feff7f2f4eaa16c5ea625e5fe6beab0c4038907b7f5e715987"} Mar 12 18:24:42 crc kubenswrapper[4926]: I0312 18:24:42.237393 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerDied","Data":"0b78f1753e8c4a22ed2f4147f06789b56f6db668b7a92626590701ea109bcd09"} Mar 12 18:24:45 crc kubenswrapper[4926]: I0312 18:24:45.878844 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:45 crc kubenswrapper[4926]: I0312 18:24:45.879632 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:45 crc kubenswrapper[4926]: I0312 18:24:45.920305 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:45 crc kubenswrapper[4926]: I0312 18:24:45.957123 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.276649 4926 generic.go:334] "Generic (PLEG): container finished" podID="436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" containerID="683758cbf77fef46bca458acf7d282b0867f1aa1a065fbc9925548f2b1c73316" exitCode=0 Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.276744 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" event={"ID":"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57","Type":"ContainerDied","Data":"683758cbf77fef46bca458acf7d282b0867f1aa1a065fbc9925548f2b1c73316"} Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282113 4926 generic.go:334] "Generic (PLEG): container finished" podID="f48005ff-c364-47aa-8253-9a3124097a10" containerID="435ef2132df5554295889ec9ba808434afc58d13077a9a39e7222dd605149d55" exitCode=0 Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282179 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerDied","Data":"435ef2132df5554295889ec9ba808434afc58d13077a9a39e7222dd605149d55"} Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282225 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f48005ff-c364-47aa-8253-9a3124097a10","Type":"ContainerDied","Data":"f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905"} Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282241 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f451adeb6303d8745b2e50f6138760aa3de2cccdedbd3b8de83c0f3d99a02905" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282485 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.282520 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" 
Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.297120 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482176 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq7gn\" (UniqueName: \"kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482247 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482425 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482513 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482548 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482636 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.482669 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd\") pod \"f48005ff-c364-47aa-8253-9a3124097a10\" (UID: \"f48005ff-c364-47aa-8253-9a3124097a10\") " Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.483326 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.483581 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.484046 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.484076 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f48005ff-c364-47aa-8253-9a3124097a10-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.488102 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn" (OuterVolumeSpecName: "kube-api-access-lq7gn") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "kube-api-access-lq7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.488952 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts" (OuterVolumeSpecName: "scripts") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.514243 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.561540 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.586301 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.586351 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.586370 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.586390 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq7gn\" (UniqueName: \"kubernetes.io/projected/f48005ff-c364-47aa-8253-9a3124097a10-kube-api-access-lq7gn\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.602049 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data" (OuterVolumeSpecName: "config-data") pod "f48005ff-c364-47aa-8253-9a3124097a10" (UID: "f48005ff-c364-47aa-8253-9a3124097a10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:46 crc kubenswrapper[4926]: I0312 18:24:46.688531 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f48005ff-c364-47aa-8253-9a3124097a10-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.291594 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.349164 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.358723 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.374527 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:47 crc kubenswrapper[4926]: E0312 18:24:47.374996 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="proxy-httpd" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375015 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="proxy-httpd" Mar 12 18:24:47 crc kubenswrapper[4926]: E0312 18:24:47.375039 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="sg-core" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375048 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="sg-core" Mar 12 18:24:47 crc kubenswrapper[4926]: E0312 18:24:47.375062 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-central-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375070 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-central-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: E0312 18:24:47.375084 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-notification-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375091 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-notification-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375287 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="proxy-httpd" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375306 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-central-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375317 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="ceilometer-notification-agent" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.375330 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48005ff-c364-47aa-8253-9a3124097a10" containerName="sg-core" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.382057 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.384217 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.386801 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.392064 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516480 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516539 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516566 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516640 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516662 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516686 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.516710 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbct5\" (UniqueName: \"kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619028 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619103 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619339 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619383 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619429 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619549 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbct5\" (UniqueName: \"kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619614 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.619882 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.620197 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.625297 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.627151 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.628129 4926 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.634084 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.641968 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbct5\" (UniqueName: \"kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5\") pod \"ceilometer-0\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.707733 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.741884 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.935178 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbq59\" (UniqueName: \"kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59\") pod \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.935207 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts\") pod \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.935246 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data\") pod \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.935310 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle\") pod \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\" (UID: \"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57\") " Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.940566 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59" (OuterVolumeSpecName: "kube-api-access-fbq59") pod "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" (UID: "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57"). InnerVolumeSpecName "kube-api-access-fbq59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.944793 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts" (OuterVolumeSpecName: "scripts") pod "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" (UID: "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.975989 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" (UID: "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:47 crc kubenswrapper[4926]: I0312 18:24:47.994386 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data" (OuterVolumeSpecName: "config-data") pod "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" (UID: "436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.038653 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.038711 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbq59\" (UniqueName: \"kubernetes.io/projected/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-kube-api-access-fbq59\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.038733 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.038747 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.195530 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.303476 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerStarted","Data":"ab0418481e92c6856771c7ea5c23c736e2ec58cc43ca849b59c1411888deb2c2"} Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.305650 4926 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.306550 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.312682 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cbqv8" event={"ID":"436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57","Type":"ContainerDied","Data":"f726c742045749f4782471f8e9d8e4c6886b24fac2346f0142597eae7e927832"} Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.312752 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f726c742045749f4782471f8e9d8e4c6886b24fac2346f0142597eae7e927832" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.312778 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.448068 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 18:24:48 crc kubenswrapper[4926]: E0312 18:24:48.448374 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" containerName="nova-cell0-conductor-db-sync" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.448386 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" containerName="nova-cell0-conductor-db-sync" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.448591 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" containerName="nova-cell0-conductor-db-sync" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.449126 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.452875 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.453060 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-89pqc" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.462683 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.500096 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48005ff-c364-47aa-8253-9a3124097a10" path="/var/lib/kubelet/pods/f48005ff-c364-47aa-8253-9a3124097a10/volumes" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.551870 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.551918 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7d6b\" (UniqueName: \"kubernetes.io/projected/397b3426-1e6c-468a-8d6d-562e70944d9d-kube-api-access-h7d6b\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.552005 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.552747 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.655752 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.655820 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7d6b\" (UniqueName: \"kubernetes.io/projected/397b3426-1e6c-468a-8d6d-562e70944d9d-kube-api-access-h7d6b\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.655917 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.661287 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.661732 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397b3426-1e6c-468a-8d6d-562e70944d9d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.686232 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7d6b\" (UniqueName: \"kubernetes.io/projected/397b3426-1e6c-468a-8d6d-562e70944d9d-kube-api-access-h7d6b\") pod \"nova-cell0-conductor-0\" (UID: \"397b3426-1e6c-468a-8d6d-562e70944d9d\") " pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:48 crc kubenswrapper[4926]: I0312 18:24:48.765877 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:49 crc kubenswrapper[4926]: I0312 18:24:49.233267 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 12 18:24:49 crc kubenswrapper[4926]: I0312 18:24:49.317602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"397b3426-1e6c-468a-8d6d-562e70944d9d","Type":"ContainerStarted","Data":"58b6f95ba06f065609956e1b292ef3638775b20aa0e44627ce4cf95bbe74986b"} Mar 12 18:24:49 crc kubenswrapper[4926]: I0312 18:24:49.327072 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerStarted","Data":"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d"} Mar 12 18:24:50 crc kubenswrapper[4926]: I0312 18:24:50.336858 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerStarted","Data":"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d"} Mar 12 18:24:50 crc kubenswrapper[4926]: I0312 18:24:50.339678 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"397b3426-1e6c-468a-8d6d-562e70944d9d","Type":"ContainerStarted","Data":"6389117384c859ad32657d06cff7727c2a26c4645a8bf2642376f03ed6f98468"} Mar 12 18:24:50 crc kubenswrapper[4926]: I0312 18:24:50.341066 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:50 crc kubenswrapper[4926]: I0312 18:24:50.361954 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.361934057 podStartE2EDuration="2.361934057s" podCreationTimestamp="2026-03-12 18:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:24:50.355522326 +0000 UTC m=+1330.724148669" watchObservedRunningTime="2026-03-12 18:24:50.361934057 +0000 UTC m=+1330.730560390" Mar 12 18:24:51 crc kubenswrapper[4926]: I0312 18:24:51.354687 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerStarted","Data":"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b"} Mar 12 18:24:53 crc kubenswrapper[4926]: I0312 18:24:53.381192 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerStarted","Data":"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0"} Mar 12 18:24:53 crc kubenswrapper[4926]: I0312 18:24:53.383496 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 18:24:53 crc kubenswrapper[4926]: I0312 18:24:53.418084 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.087379613 podStartE2EDuration="6.418066725s" podCreationTimestamp="2026-03-12 18:24:47 +0000 UTC" firstStartedPulling="2026-03-12 18:24:48.287153933 +0000 UTC m=+1328.655780276" lastFinishedPulling="2026-03-12 18:24:52.617841045 +0000 UTC m=+1332.986467388" observedRunningTime="2026-03-12 18:24:53.412141739 +0000 UTC m=+1333.780768072" watchObservedRunningTime="2026-03-12 18:24:53.418066725 +0000 UTC m=+1333.786693058" Mar 12 
18:24:56 crc kubenswrapper[4926]: I0312 18:24:56.818121 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:24:56 crc kubenswrapper[4926]: I0312 18:24:56.818490 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:24:56 crc kubenswrapper[4926]: I0312 18:24:56.818559 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:24:56 crc kubenswrapper[4926]: I0312 18:24:56.819491 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:24:56 crc kubenswrapper[4926]: I0312 18:24:56.819566 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f" gracePeriod=600 Mar 12 18:24:57 crc kubenswrapper[4926]: I0312 18:24:57.435593 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f" exitCode=0 Mar 12 18:24:57 crc kubenswrapper[4926]: I0312 18:24:57.435667 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f"} Mar 12 18:24:57 crc kubenswrapper[4926]: I0312 18:24:57.436141 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde"} Mar 12 18:24:57 crc kubenswrapper[4926]: I0312 18:24:57.436164 4926 scope.go:117] "RemoveContainer" containerID="10c4816f4e2fc4ce2bc2183a633d9bc53980639515bfce0cf198e862b133fadb" Mar 12 18:24:58 crc kubenswrapper[4926]: I0312 18:24:58.799900 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.375063 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-88dpb"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.377080 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.379392 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.379999 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.386644 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-88dpb"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.559513 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.561242 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.563003 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.563037 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.563084 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxvsd\" (UniqueName: \"kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.563134 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.572893 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.573043 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.594538 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.595962 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.610823 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.617821 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.655531 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.656613 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.659513 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.668890 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrtd\" (UniqueName: \"kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669010 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669034 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669084 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669111 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxvsd\" (UniqueName: \"kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669143 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.669177 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc 
kubenswrapper[4926]: I0312 18:24:59.669209 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.676482 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.690008 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.697669 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxvsd\" (UniqueName: \"kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.698048 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data\") pod \"nova-cell0-cell-mapping-88dpb\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.752644 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773500 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773544 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773583 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m94n\" (UniqueName: \"kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773602 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " 
pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773661 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773683 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss276\" (UniqueName: \"kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773717 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773753 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773776 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773795 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.773822 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrtd\" (UniqueName: \"kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.779031 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.785955 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.801119 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data\") pod \"nova-api-0\" (UID: 
\"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.804501 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.805952 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.817852 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.831498 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.832692 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.840718 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.844703 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.848466 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrtd\" (UniqueName: \"kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd\") pod \"nova-api-0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.875827 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m94n\" (UniqueName: \"kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.875868 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.875934 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.875959 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss276\" (UniqueName: \"kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.876024 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.876056 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.876072 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.877961 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.881398 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.884814 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.887932 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.893676 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.893716 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.895919 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss276\" (UniqueName: \"kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276\") pod \"nova-scheduler-0\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.898361 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m94n\" (UniqueName: \"kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n\") pod \"nova-metadata-0\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.927347 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977474 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977529 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977555 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppkxw\" (UniqueName: \"kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977574 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977607 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.977866 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.978004 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7xnp\" (UniqueName: \"kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.978038 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.978065 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.978141 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:24:59 crc kubenswrapper[4926]: I0312 18:24:59.997972 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.079917 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080258 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080308 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080340 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppkxw\" (UniqueName: \"kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080365 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080402 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080523 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7xnp\" (UniqueName: \"kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080557 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.080593 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.081497 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.081923 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.083168 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.084384 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.084922 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config\") pod 
\"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.086522 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.086785 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.098017 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppkxw\" (UniqueName: \"kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw\") pod \"nova-cell1-novncproxy-0\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.104452 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7xnp\" (UniqueName: \"kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp\") pod \"dnsmasq-dns-757b4f8459-jwx72\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.211403 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.215641 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.455376 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.581213 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.714145 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.776503 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-88dpb"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.797912 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2wcs"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.802488 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.808145 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.808622 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.809467 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2wcs"] Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.899460 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.899499 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.899625 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:00 crc kubenswrapper[4926]: I0312 18:25:00.899716 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t75rc\" (UniqueName: \"kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.001005 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.001154 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t75rc\" (UniqueName: \"kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.001236 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.001265 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.006667 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.014161 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.026198 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.026668 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t75rc\" (UniqueName: \"kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc\") pod \"nova-cell1-conductor-db-sync-x2wcs\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.032221 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.051717 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:25:01 crc kubenswrapper[4926]: W0312 18:25:01.052952 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod458c7cf7_4e8a_4272_8940_1d730293a0ca.slice/crio-820539f8ae4b6d013f20317aef2ceed339b3e7e027d5738aa8b884143c185f6f WatchSource:0}: Error finding container 820539f8ae4b6d013f20317aef2ceed339b3e7e027d5738aa8b884143c185f6f: Status 404 returned error can't find the container with id 820539f8ae4b6d013f20317aef2ceed339b3e7e027d5738aa8b884143c185f6f Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.203132 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:01 crc kubenswrapper[4926]: E0312 18:25:01.444777 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: decoding bearer token (last URL \"https://quay.io/v2/auth?account=openshift-release-dev%2Bocm_access_1b89217552bc42d1be3fb06a1aed001a&scope=repository%3Apodified-antelope-centos9%2Fopenstack-nova-novncproxy%3Apull&service=quay.io\", body start \"\"): unexpected end of JSON input" image="quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified" Mar 12 18:25:01 crc kubenswrapper[4926]: E0312 18:25:01.445332 4926 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell1-novncproxy-novncproxy,Image:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfbhc9h65ch67ch588h57bhd9hb9h98hbbh649h655h67ch55ch689h596h594h668h595h64fh68bh64ch66h696h669h576h56dh58bh59dh54dhffh8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-novncproxy-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppkxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/vnc_lite.html,Port:{0 6080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell1-novncproxy-0_openstack(eba9290a-15ad-403b-901b-bab0917192dd): ErrImagePull: initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: decoding bearer token (last URL \"https://quay.io/v2/auth?account=openshift-release-dev%2Bocm_access_1b89217552bc42d1be3fb06a1aed001a&scope=repository%3Apodified-antelope-centos9%2Fopenstack-nova-novncproxy%3Apull&service=quay.io\", body start \"\"): unexpected end of JSON input" logger="UnhandledError" Mar 12 18:25:01 crc kubenswrapper[4926]: E0312 18:25:01.446800 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ErrImagePull: \"initializing source docker://quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified: decoding bearer token (last URL \\\"https://quay.io/v2/auth?account=openshift-release-dev%2Bocm_access_1b89217552bc42d1be3fb06a1aed001a&scope=repository%3Apodified-antelope-centos9%2Fopenstack-nova-novncproxy%3Apull&service=quay.io\\\", body start \\\"\\\"): unexpected end of JSON input\"" pod="openstack/nova-cell1-novncproxy-0" podUID="eba9290a-15ad-403b-901b-bab0917192dd" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.491888 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eba9290a-15ad-403b-901b-bab0917192dd","Type":"ContainerStarted","Data":"50f6a42234fb9c45d38c1b23471febadd40f957d027586ffd43709de0e0faeab"} Mar 12 18:25:01 crc kubenswrapper[4926]: E0312 18:25:01.497302 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" podUID="eba9290a-15ad-403b-901b-bab0917192dd" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.502696 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c00983f-ac91-410c-9ee3-d55342198d70","Type":"ContainerStarted","Data":"8a2d87ff1b1f13bbcd71a5ed9af411076eaddd0c93ccca3397ca3bb1a0eb3af8"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.515764 4926 generic.go:334] "Generic (PLEG): container finished" podID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerID="12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491" exitCode=0 Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.515862 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" event={"ID":"458c7cf7-4e8a-4272-8940-1d730293a0ca","Type":"ContainerDied","Data":"12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.515888 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" event={"ID":"458c7cf7-4e8a-4272-8940-1d730293a0ca","Type":"ContainerStarted","Data":"820539f8ae4b6d013f20317aef2ceed339b3e7e027d5738aa8b884143c185f6f"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.526887 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerStarted","Data":"2f3d265c00f5c2ef87695352db3e88d3c30936c8db618015345f1cf9afeddcc5"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.537339 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-88dpb" event={"ID":"d30718d9-b986-4490-9e43-56eaa459aeb5","Type":"ContainerStarted","Data":"2f4d726cd118a123d7b32175fc4e520ff68fb0c877630e02cc5267fde94f47a0"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.537381 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-88dpb" event={"ID":"d30718d9-b986-4490-9e43-56eaa459aeb5","Type":"ContainerStarted","Data":"c008177eb27a46bebec75159ab0b2a0dca1d32e65ccd34eeac95828b596395d8"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.545955 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerStarted","Data":"42ea86479f09e663945c5ef83197d6d342bd9e261315d58ca9f838ea2ad55e1c"} Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.581867 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-88dpb" podStartSLOduration=2.581834546 podStartE2EDuration="2.581834546s" podCreationTimestamp="2026-03-12 18:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:01.557129521 +0000 UTC m=+1341.925755854" watchObservedRunningTime="2026-03-12 18:25:01.581834546 +0000 UTC m=+1341.950460879" Mar 12 18:25:01 crc kubenswrapper[4926]: I0312 18:25:01.716334 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2wcs"] Mar 12 18:25:01 crc kubenswrapper[4926]: W0312 18:25:01.724424 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb478ba05_2155_4b13_a58a_002be25403d0.slice/crio-7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01 WatchSource:0}: Error finding container 7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01: Status 404 returned error can't find the container with id 7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01 Mar 12 18:25:02 crc kubenswrapper[4926]: I0312 18:25:02.565950 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" event={"ID":"b478ba05-2155-4b13-a58a-002be25403d0","Type":"ContainerStarted","Data":"14cbad167526774b9aa42516a514a2cffabd952a439286d5c8bfe698bbd81e8b"} Mar 12 18:25:02 crc kubenswrapper[4926]: I0312 18:25:02.566547 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" event={"ID":"b478ba05-2155-4b13-a58a-002be25403d0","Type":"ContainerStarted","Data":"7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01"} Mar 12 18:25:02 crc kubenswrapper[4926]: I0312 18:25:02.570324 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" event={"ID":"458c7cf7-4e8a-4272-8940-1d730293a0ca","Type":"ContainerStarted","Data":"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3"} Mar 12 18:25:02 crc kubenswrapper[4926]: E0312 18:25:02.571514 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell1-novncproxy-novncproxy\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified\\\"\"" pod="openstack/nova-cell1-novncproxy-0" podUID="eba9290a-15ad-403b-901b-bab0917192dd" Mar 12 18:25:02 crc kubenswrapper[4926]: I0312 18:25:02.582920 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" podStartSLOduration=2.582902079 podStartE2EDuration="2.582902079s" podCreationTimestamp="2026-03-12 18:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:02.580507544 +0000 UTC m=+1342.949133877" watchObservedRunningTime="2026-03-12 18:25:02.582902079 +0000 UTC m=+1342.951528412" Mar 12 18:25:02 crc kubenswrapper[4926]: I0312 18:25:02.597275 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" podStartSLOduration=3.59725317 podStartE2EDuration="3.59725317s" podCreationTimestamp="2026-03-12 18:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:02.595895497 +0000 UTC m=+1342.964521830" watchObservedRunningTime="2026-03-12 18:25:02.59725317 +0000 UTC m=+1342.965879503" Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.406459 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.447302 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.580130 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c00983f-ac91-410c-9ee3-d55342198d70","Type":"ContainerStarted","Data":"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323"} Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.584842 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerStarted","Data":"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91"} Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.587741 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerStarted","Data":"2acf5477ea08c1fb97c59765c73d7abfdadc6189fbd1cf4549d66c1007c1d817"} Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.588805 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:03 crc kubenswrapper[4926]: I0312 18:25:03.613728 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.205749246 podStartE2EDuration="4.613702255s" podCreationTimestamp="2026-03-12 18:24:59 +0000 UTC" firstStartedPulling="2026-03-12 18:25:00.684905672 +0000 UTC m=+1341.053532015" lastFinishedPulling="2026-03-12 18:25:03.092858691 +0000 UTC m=+1343.461485024" observedRunningTime="2026-03-12 18:25:03.598229789 +0000 UTC m=+1343.966856142" watchObservedRunningTime="2026-03-12 18:25:03.613702255 +0000 UTC m=+1343.982328588" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.067912 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.203197 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data\") pod \"eba9290a-15ad-403b-901b-bab0917192dd\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.203285 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppkxw\" (UniqueName: \"kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw\") pod \"eba9290a-15ad-403b-901b-bab0917192dd\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.203325 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle\") pod \"eba9290a-15ad-403b-901b-bab0917192dd\" (UID: \"eba9290a-15ad-403b-901b-bab0917192dd\") " Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.209588 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw" (OuterVolumeSpecName: "kube-api-access-ppkxw") pod "eba9290a-15ad-403b-901b-bab0917192dd" (UID: "eba9290a-15ad-403b-901b-bab0917192dd"). InnerVolumeSpecName "kube-api-access-ppkxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.210342 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eba9290a-15ad-403b-901b-bab0917192dd" (UID: "eba9290a-15ad-403b-901b-bab0917192dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.222125 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data" (OuterVolumeSpecName: "config-data") pod "eba9290a-15ad-403b-901b-bab0917192dd" (UID: "eba9290a-15ad-403b-901b-bab0917192dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.306030 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.306088 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppkxw\" (UniqueName: \"kubernetes.io/projected/eba9290a-15ad-403b-901b-bab0917192dd-kube-api-access-ppkxw\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.306110 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba9290a-15ad-403b-901b-bab0917192dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.598838 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerStarted","Data":"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69"} Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.598917 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-log" containerID="cri-o://7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" gracePeriod=30 Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.598955 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-metadata" containerID="cri-o://aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" gracePeriod=30 Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.603419 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerStarted","Data":"d312e71f700c8194066a4bfb0efbe92c4e2b9fbced6805317853fff15c35d5d6"} Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.609043 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eba9290a-15ad-403b-901b-bab0917192dd","Type":"ContainerDied","Data":"50f6a42234fb9c45d38c1b23471febadd40f957d027586ffd43709de0e0faeab"} Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.609119 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.625654 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.111126465 podStartE2EDuration="5.625632127s" podCreationTimestamp="2026-03-12 18:24:59 +0000 UTC" firstStartedPulling="2026-03-12 18:25:00.578843944 +0000 UTC m=+1340.947470277" lastFinishedPulling="2026-03-12 18:25:03.093349606 +0000 UTC m=+1343.461975939" observedRunningTime="2026-03-12 18:25:04.621706435 +0000 UTC m=+1344.990332768" watchObservedRunningTime="2026-03-12 18:25:04.625632127 +0000 UTC m=+1344.994258470" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.645615 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.993577968 podStartE2EDuration="5.645596254s" podCreationTimestamp="2026-03-12 18:24:59 +0000 UTC" firstStartedPulling="2026-03-12 18:25:00.467123649 +0000 UTC m=+1340.835749982" lastFinishedPulling="2026-03-12 18:25:03.119141935 +0000 UTC m=+1343.487768268" observedRunningTime="2026-03-12 18:25:04.645092568 +0000 UTC m=+1345.013718911" watchObservedRunningTime="2026-03-12 18:25:04.645596254 +0000 UTC m=+1345.014222597" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.689910 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.700348 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.710178 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.711748 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.718603 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.718821 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.718960 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.719479 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.814759 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.814814 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.814907 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmpz\" (UniqueName: \"kubernetes.io/projected/5990afe5-179c-401d-99eb-58b27e2bfc9e-kube-api-access-lhmpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.815028 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.815053 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.916758 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.916816 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 
18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.916961 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.916985 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.918141 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmpz\" (UniqueName: \"kubernetes.io/projected/5990afe5-179c-401d-99eb-58b27e2bfc9e-kube-api-access-lhmpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.921971 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.924107 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.924905 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.928900 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.928964 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.931179 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5990afe5-179c-401d-99eb-58b27e2bfc9e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.949354 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmpz\" (UniqueName: \"kubernetes.io/projected/5990afe5-179c-401d-99eb-58b27e2bfc9e-kube-api-access-lhmpz\") pod \"nova-cell1-novncproxy-0\" (UID: \"5990afe5-179c-401d-99eb-58b27e2bfc9e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:04 crc kubenswrapper[4926]: I0312 18:25:04.979002 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 
12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.077156 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.383324 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.536418 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data\") pod \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.536880 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m94n\" (UniqueName: \"kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n\") pod \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.537049 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs\") pod \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.537094 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle\") pod \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\" (UID: \"4307f4cd-b3d5-41cf-94ba-055b0f09c920\") " Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.537791 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs" (OuterVolumeSpecName: "logs") pod "4307f4cd-b3d5-41cf-94ba-055b0f09c920" (UID: "4307f4cd-b3d5-41cf-94ba-055b0f09c920"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.541649 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n" (OuterVolumeSpecName: "kube-api-access-9m94n") pod "4307f4cd-b3d5-41cf-94ba-055b0f09c920" (UID: "4307f4cd-b3d5-41cf-94ba-055b0f09c920"). InnerVolumeSpecName "kube-api-access-9m94n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.569398 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data" (OuterVolumeSpecName: "config-data") pod "4307f4cd-b3d5-41cf-94ba-055b0f09c920" (UID: "4307f4cd-b3d5-41cf-94ba-055b0f09c920"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.581590 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4307f4cd-b3d5-41cf-94ba-055b0f09c920" (UID: "4307f4cd-b3d5-41cf-94ba-055b0f09c920"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620663 4926 generic.go:334] "Generic (PLEG): container finished" podID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerID="aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" exitCode=0 Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620695 4926 generic.go:334] "Generic (PLEG): container finished" podID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerID="7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" exitCode=143 Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620728 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620763 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerDied","Data":"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69"} Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620790 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerDied","Data":"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91"} Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620799 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4307f4cd-b3d5-41cf-94ba-055b0f09c920","Type":"ContainerDied","Data":"2f3d265c00f5c2ef87695352db3e88d3c30936c8db618015345f1cf9afeddcc5"} Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.620844 4926 scope.go:117] "RemoveContainer" containerID="aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.639413 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4307f4cd-b3d5-41cf-94ba-055b0f09c920-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.639526 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.639545 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4307f4cd-b3d5-41cf-94ba-055b0f09c920-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.639563 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m94n\" (UniqueName: \"kubernetes.io/projected/4307f4cd-b3d5-41cf-94ba-055b0f09c920-kube-api-access-9m94n\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.657548 4926 scope.go:117] "RemoveContainer" containerID="7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.675608 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.711033 4926 scope.go:117] "RemoveContainer" containerID="aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" Mar 12 18:25:05 crc kubenswrapper[4926]: E0312 18:25:05.711541 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69\": container with ID starting with aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69 not found: ID does not exist" containerID="aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.711605 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69"} err="failed to get container status \"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69\": rpc error: code = NotFound desc = could not find container \"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69\": container with ID starting with aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69 not found: ID does not exist" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.711634 4926 scope.go:117] "RemoveContainer" containerID="7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" Mar 12 18:25:05 crc kubenswrapper[4926]: E0312 18:25:05.712071 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91\": container with ID starting with 7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91 not found: ID does not exist" containerID="7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.712226 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91"} err="failed to get container status \"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91\": rpc error: code = NotFound desc = could not find container \"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91\": container with ID starting with 7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91 not found: ID does not exist" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.712362 4926 scope.go:117] "RemoveContainer" containerID="aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.713016 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69"} err="failed to get container status \"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69\": rpc error: code = NotFound desc = could not find container \"aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69\": container with ID starting with aa88e9c555f20e87aa7f99a35587675471ba4c00aa7e3cf59b886406679b6d69 not found: ID does not exist" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.713086 4926 scope.go:117] "RemoveContainer" containerID="7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.713561 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91"} err="failed to get container status \"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91\": rpc error: code = NotFound desc = could not find container 
\"7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91\": container with ID starting with 7547facc65c6eefbe2350b5bf9bbb600c1c3ab2aee9b66cde1ae452f75229e91 not found: ID does not exist" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.731297 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.744615 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.753500 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:05 crc kubenswrapper[4926]: E0312 18:25:05.754020 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-metadata" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.754047 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-metadata" Mar 12 18:25:05 crc kubenswrapper[4926]: E0312 18:25:05.754079 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-log" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.754088 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-log" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.754309 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-metadata" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.754327 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" containerName="nova-metadata-log" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.755776 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.757840 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.759929 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.764210 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.945141 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.945207 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.945321 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.945346 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:05 crc kubenswrapper[4926]: I0312 18:25:05.945375 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkw7f\" (UniqueName: \"kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.048033 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.048118 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.048283 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " 
pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.048327 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.048386 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkw7f\" (UniqueName: \"kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.049280 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.056192 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.056802 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.067614 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.074921 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkw7f\" (UniqueName: \"kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f\") pod \"nova-metadata-0\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.080119 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.502899 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4307f4cd-b3d5-41cf-94ba-055b0f09c920" path="/var/lib/kubelet/pods/4307f4cd-b3d5-41cf-94ba-055b0f09c920/volumes" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.504268 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba9290a-15ad-403b-901b-bab0917192dd" path="/var/lib/kubelet/pods/eba9290a-15ad-403b-901b-bab0917192dd/volumes" Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.565408 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:06 crc kubenswrapper[4926]: I0312 18:25:06.635189 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5990afe5-179c-401d-99eb-58b27e2bfc9e","Type":"ContainerStarted","Data":"9632a93bdd6e87ebaad7f0644944aca90fb862d669cf288ad2f9ce7ad904f639"} Mar 12 18:25:06 crc kubenswrapper[4926]: W0312 18:25:06.894270 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1344f618_3fb1_4718_ac28_aaac70801a43.slice/crio-21a532f650953a29a6610144508556c5c62c3cfee82633d3ccf1f934e89f4ad7 WatchSource:0}: Error finding container 21a532f650953a29a6610144508556c5c62c3cfee82633d3ccf1f934e89f4ad7: Status 404 returned error can't find the container with id 21a532f650953a29a6610144508556c5c62c3cfee82633d3ccf1f934e89f4ad7 Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.649578 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5990afe5-179c-401d-99eb-58b27e2bfc9e","Type":"ContainerStarted","Data":"8004a0c4872d595eff541c18db2d835ce9f58755458fed592f03016d57654f99"} Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.655428 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerStarted","Data":"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db"} Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.655509 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerStarted","Data":"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88"} Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.655523 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerStarted","Data":"21a532f650953a29a6610144508556c5c62c3cfee82633d3ccf1f934e89f4ad7"} Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.679721 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.374399772 podStartE2EDuration="3.679704611s" podCreationTimestamp="2026-03-12 18:25:04 +0000 UTC" firstStartedPulling="2026-03-12 18:25:05.665424875 +0000 UTC m=+1346.034051198" lastFinishedPulling="2026-03-12 18:25:06.970729704 +0000 UTC m=+1347.339356037" observedRunningTime="2026-03-12 18:25:07.677081198 +0000 UTC m=+1348.045707541" watchObservedRunningTime="2026-03-12 18:25:07.679704611 +0000 UTC m=+1348.048330944" Mar 12 18:25:07 crc kubenswrapper[4926]: I0312 18:25:07.703598 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.70357994 podStartE2EDuration="2.70357994s" podCreationTimestamp="2026-03-12 18:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:07.701901317 +0000 UTC m=+1348.070527690" watchObservedRunningTime="2026-03-12 18:25:07.70357994 +0000 UTC m=+1348.072206273" Mar 12 18:25:08 crc kubenswrapper[4926]: I0312 18:25:08.680329 4926 generic.go:334] "Generic (PLEG): container finished" podID="d30718d9-b986-4490-9e43-56eaa459aeb5" containerID="2f4d726cd118a123d7b32175fc4e520ff68fb0c877630e02cc5267fde94f47a0" exitCode=0 Mar 12 18:25:08 crc kubenswrapper[4926]: I0312 18:25:08.680402 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-88dpb" event={"ID":"d30718d9-b986-4490-9e43-56eaa459aeb5","Type":"ContainerDied","Data":"2f4d726cd118a123d7b32175fc4e520ff68fb0c877630e02cc5267fde94f47a0"} Mar 12 18:25:09 crc kubenswrapper[4926]: I0312 18:25:09.699623 4926 generic.go:334] "Generic (PLEG): container finished" podID="b478ba05-2155-4b13-a58a-002be25403d0" containerID="14cbad167526774b9aa42516a514a2cffabd952a439286d5c8bfe698bbd81e8b" exitCode=0 Mar 12 18:25:09 crc kubenswrapper[4926]: I0312 18:25:09.699775 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" event={"ID":"b478ba05-2155-4b13-a58a-002be25403d0","Type":"ContainerDied","Data":"14cbad167526774b9aa42516a514a2cffabd952a439286d5c8bfe698bbd81e8b"} Mar 12 18:25:09 crc kubenswrapper[4926]: I0312 18:25:09.889474 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 18:25:09 crc kubenswrapper[4926]: I0312 18:25:09.889713 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 18:25:09 crc kubenswrapper[4926]: I0312 18:25:09.979268 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.016783 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.077999 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.140853 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.213654 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.233093 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts\") pod \"d30718d9-b986-4490-9e43-56eaa459aeb5\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.233164 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data\") pod \"d30718d9-b986-4490-9e43-56eaa459aeb5\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.233235 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle\") pod \"d30718d9-b986-4490-9e43-56eaa459aeb5\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.233330 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxvsd\" (UniqueName: \"kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd\") pod \"d30718d9-b986-4490-9e43-56eaa459aeb5\" (UID: \"d30718d9-b986-4490-9e43-56eaa459aeb5\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.251226 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd" (OuterVolumeSpecName: "kube-api-access-mxvsd") pod "d30718d9-b986-4490-9e43-56eaa459aeb5" (UID: "d30718d9-b986-4490-9e43-56eaa459aeb5"). InnerVolumeSpecName "kube-api-access-mxvsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.284020 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts" (OuterVolumeSpecName: "scripts") pod "d30718d9-b986-4490-9e43-56eaa459aeb5" (UID: "d30718d9-b986-4490-9e43-56eaa459aeb5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.313244 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.313964 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="dnsmasq-dns" containerID="cri-o://ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b" gracePeriod=10 Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.321974 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data" (OuterVolumeSpecName: "config-data") pod "d30718d9-b986-4490-9e43-56eaa459aeb5" (UID: "d30718d9-b986-4490-9e43-56eaa459aeb5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.336514 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d30718d9-b986-4490-9e43-56eaa459aeb5" (UID: "d30718d9-b986-4490-9e43-56eaa459aeb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.341198 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.341225 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.341236 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d30718d9-b986-4490-9e43-56eaa459aeb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.341248 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxvsd\" (UniqueName: \"kubernetes.io/projected/d30718d9-b986-4490-9e43-56eaa459aeb5-kube-api-access-mxvsd\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: E0312 18:25:10.557985 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd30718d9_b986_4490_9e43_56eaa459aeb5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ab53a0_e811_4b31_8a1e_d71be4115b2c.slice/crio-ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b.scope\": RecentStats: unable to find data in memory cache]" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.712308 4926 generic.go:334] "Generic (PLEG): container finished" podID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerID="ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b" exitCode=0 Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.712384 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerDied","Data":"ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b"} Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.714265 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-88dpb" event={"ID":"d30718d9-b986-4490-9e43-56eaa459aeb5","Type":"ContainerDied","Data":"c008177eb27a46bebec75159ab0b2a0dca1d32e65ccd34eeac95828b596395d8"} Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.714314 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c008177eb27a46bebec75159ab0b2a0dca1d32e65ccd34eeac95828b596395d8" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.714369 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-88dpb" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.752361 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.765487 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.850784 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.850856 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.850980 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.851011 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.851101 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm8xc\" (UniqueName: \"kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.851135 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb\") pod \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\" (UID: \"75ab53a0-e811-4b31-8a1e-d71be4115b2c\") " Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.855875 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc" (OuterVolumeSpecName: "kube-api-access-sm8xc") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "kube-api-access-sm8xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.912927 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.929652 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.931978 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config" (OuterVolumeSpecName: "config") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.947747 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.950339 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75ab53a0-e811-4b31-8a1e-d71be4115b2c" (UID: "75ab53a0-e811-4b31-8a1e-d71be4115b2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968731 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm8xc\" (UniqueName: \"kubernetes.io/projected/75ab53a0-e811-4b31-8a1e-d71be4115b2c-kube-api-access-sm8xc\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968765 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968775 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968783 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968792 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:10 crc kubenswrapper[4926]: I0312 18:25:10.968800 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75ab53a0-e811-4b31-8a1e-d71be4115b2c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.007107 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.007449 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.019552 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.019908 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-log" containerID="cri-o://2acf5477ea08c1fb97c59765c73d7abfdadc6189fbd1cf4549d66c1007c1d817" gracePeriod=30 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.020328 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-api" containerID="cri-o://d312e71f700c8194066a4bfb0efbe92c4e2b9fbced6805317853fff15c35d5d6" gracePeriod=30 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.042254 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.042482 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-log" containerID="cri-o://cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" gracePeriod=30 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.042553 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-metadata" containerID="cri-o://f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" gracePeriod=30 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.080544 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.080616 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.111882 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.273769 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle\") pod \"b478ba05-2155-4b13-a58a-002be25403d0\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.273972 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts\") pod \"b478ba05-2155-4b13-a58a-002be25403d0\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.274461 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t75rc\" (UniqueName: \"kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc\") pod \"b478ba05-2155-4b13-a58a-002be25403d0\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.274488 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data\") pod \"b478ba05-2155-4b13-a58a-002be25403d0\" (UID: \"b478ba05-2155-4b13-a58a-002be25403d0\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.288590 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.289203 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts" (OuterVolumeSpecName: "scripts") pod "b478ba05-2155-4b13-a58a-002be25403d0" (UID: "b478ba05-2155-4b13-a58a-002be25403d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.289284 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc" (OuterVolumeSpecName: "kube-api-access-t75rc") pod "b478ba05-2155-4b13-a58a-002be25403d0" (UID: "b478ba05-2155-4b13-a58a-002be25403d0"). InnerVolumeSpecName "kube-api-access-t75rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.309200 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b478ba05-2155-4b13-a58a-002be25403d0" (UID: "b478ba05-2155-4b13-a58a-002be25403d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.322648 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data" (OuterVolumeSpecName: "config-data") pod "b478ba05-2155-4b13-a58a-002be25403d0" (UID: "b478ba05-2155-4b13-a58a-002be25403d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.377958 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.378212 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t75rc\" (UniqueName: \"kubernetes.io/projected/b478ba05-2155-4b13-a58a-002be25403d0-kube-api-access-t75rc\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.378225 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.378236 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b478ba05-2155-4b13-a58a-002be25403d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.567868 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.687542 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle\") pod \"1344f618-3fb1-4718-ac28-aaac70801a43\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.687685 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data\") pod \"1344f618-3fb1-4718-ac28-aaac70801a43\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.687905 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs\") pod \"1344f618-3fb1-4718-ac28-aaac70801a43\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.688363 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs" (OuterVolumeSpecName: "logs") pod "1344f618-3fb1-4718-ac28-aaac70801a43" (UID: "1344f618-3fb1-4718-ac28-aaac70801a43"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.688430 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs\") pod \"1344f618-3fb1-4718-ac28-aaac70801a43\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.688704 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkw7f\" (UniqueName: \"kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f\") pod \"1344f618-3fb1-4718-ac28-aaac70801a43\" (UID: \"1344f618-3fb1-4718-ac28-aaac70801a43\") " Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.689840 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1344f618-3fb1-4718-ac28-aaac70801a43-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.691316 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f" (OuterVolumeSpecName: "kube-api-access-bkw7f") pod "1344f618-3fb1-4718-ac28-aaac70801a43" (UID: "1344f618-3fb1-4718-ac28-aaac70801a43"). InnerVolumeSpecName "kube-api-access-bkw7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.717527 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data" (OuterVolumeSpecName: "config-data") pod "1344f618-3fb1-4718-ac28-aaac70801a43" (UID: "1344f618-3fb1-4718-ac28-aaac70801a43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.722332 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1344f618-3fb1-4718-ac28-aaac70801a43" (UID: "1344f618-3fb1-4718-ac28-aaac70801a43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.724814 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.724810 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-x2wcs" event={"ID":"b478ba05-2155-4b13-a58a-002be25403d0","Type":"ContainerDied","Data":"7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.724924 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7299f6360af3c9f86bc4fe3b2e6a46529a369ee63e25d1be1c606c3639d19f01" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727323 4926 generic.go:334] "Generic (PLEG): container finished" podID="1344f618-3fb1-4718-ac28-aaac70801a43" containerID="f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" exitCode=0 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727350 4926 generic.go:334] "Generic (PLEG): container finished" podID="1344f618-3fb1-4718-ac28-aaac70801a43" containerID="cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" exitCode=143 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727396 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerDied","Data":"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727418 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerDied","Data":"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727431 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1344f618-3fb1-4718-ac28-aaac70801a43","Type":"ContainerDied","Data":"21a532f650953a29a6610144508556c5c62c3cfee82633d3ccf1f934e89f4ad7"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727469 4926 scope.go:117] "RemoveContainer" containerID="f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.727604 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.732933 4926 generic.go:334] "Generic (PLEG): container finished" podID="522020cf-1556-4192-92c8-6cab42123da0" containerID="2acf5477ea08c1fb97c59765c73d7abfdadc6189fbd1cf4549d66c1007c1d817" exitCode=143 Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.733001 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerDied","Data":"2acf5477ea08c1fb97c59765c73d7abfdadc6189fbd1cf4549d66c1007c1d817"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.739752 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1344f618-3fb1-4718-ac28-aaac70801a43" (UID: "1344f618-3fb1-4718-ac28-aaac70801a43"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.756092 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" event={"ID":"75ab53a0-e811-4b31-8a1e-d71be4115b2c","Type":"ContainerDied","Data":"2c7e6fa4e701c31af5cac5d50047d5137f377c93ee8e669dc7d858e4f08b2574"} Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.756119 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-rln7t" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.773141 4926 scope.go:117] "RemoveContainer" containerID="cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.792756 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.792783 4926 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.792795 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkw7f\" (UniqueName: \"kubernetes.io/projected/1344f618-3fb1-4718-ac28-aaac70801a43-kube-api-access-bkw7f\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.792803 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1344f618-3fb1-4718-ac28-aaac70801a43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.803698 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.813664 4926 scope.go:117] "RemoveContainer" containerID="f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.814824 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db\": container with ID starting with f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db not found: ID does not exist" containerID="f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.814876 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db"} err="failed to get container status \"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db\": rpc error: code = NotFound desc = could not find container \"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db\": container with ID starting with f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db not found: ID does not exist" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.814897 4926 scope.go:117] "RemoveContainer" containerID="cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.816765 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88\": container with ID starting with cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88 not found: ID does not exist" containerID="cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.816800 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88"} err="failed to get container status \"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88\": rpc error: code = NotFound desc = could not find container \"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88\": container with ID starting with cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88 not found: ID does not exist" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.816818 4926 scope.go:117] "RemoveContainer" containerID="f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.817071 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db"} err="failed to get container status \"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db\": rpc error: code = NotFound desc = could not find container \"f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db\": container with ID starting with f1295472ac240b745f8f6e8144f3cf3a94d1eb3adaa6cb76a9497a26654594db not found: ID does not exist" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.817123 4926 scope.go:117] "RemoveContainer" containerID="cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.817417 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88"} err="failed to get container status \"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88\": rpc error: code = NotFound desc = could not find container \"cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88\": container with ID starting with cc38b398f5f07a27852a773cc63f78a5c3c6858e17355598c8a7599e76f24a88 not found: ID does not exist" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.817537 4926 scope.go:117] "RemoveContainer" containerID="ad9f00f4a3b221ab329deab372c49c3816c4c0e0f8a0ad87e7aefa0a45c31c5b" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.820903 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-rln7t"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.842950 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843418 4926 scope.go:117] "RemoveContainer" containerID="ba45c947e1da77d586b46aef916a37581f794d00be599bae5b1f6a31185bfcd3" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.843783 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="init" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843802 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="init" Mar 12 18:25:11 crc kubenswrapper[4926]: 
E0312 18:25:11.843816 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="dnsmasq-dns" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843822 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="dnsmasq-dns" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.843837 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-metadata" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843843 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-metadata" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.843855 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b478ba05-2155-4b13-a58a-002be25403d0" containerName="nova-cell1-conductor-db-sync" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843860 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="b478ba05-2155-4b13-a58a-002be25403d0" containerName="nova-cell1-conductor-db-sync" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.843873 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-log" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843878 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-log" Mar 12 18:25:11 crc kubenswrapper[4926]: E0312 18:25:11.843897 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30718d9-b986-4490-9e43-56eaa459aeb5" containerName="nova-manage" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.843902 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30718d9-b986-4490-9e43-56eaa459aeb5" containerName="nova-manage" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.844077 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30718d9-b986-4490-9e43-56eaa459aeb5" containerName="nova-manage" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.844087 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-log" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.844094 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" containerName="dnsmasq-dns" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.844110 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="b478ba05-2155-4b13-a58a-002be25403d0" containerName="nova-cell1-conductor-db-sync" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.844117 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" containerName="nova-metadata-metadata" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.846034 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.850865 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.851343 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.894667 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r866l\" (UniqueName: \"kubernetes.io/projected/9696ed5e-263b-457b-a771-9f7a7e27dbed-kube-api-access-r866l\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.894728 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.894753 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.996853 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r866l\" (UniqueName: \"kubernetes.io/projected/9696ed5e-263b-457b-a771-9f7a7e27dbed-kube-api-access-r866l\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.996914 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:11 crc kubenswrapper[4926]: I0312 18:25:11.996934 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.000674 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.001425 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9696ed5e-263b-457b-a771-9f7a7e27dbed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.016362 4926 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r866l\" (UniqueName: \"kubernetes.io/projected/9696ed5e-263b-457b-a771-9f7a7e27dbed-kube-api-access-r866l\") pod \"nova-cell1-conductor-0\" (UID: \"9696ed5e-263b-457b-a771-9f7a7e27dbed\") " pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.140930 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.152919 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.169779 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.171807 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.172328 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.174386 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.174661 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.180934 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.301881 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zmx\" (UniqueName: \"kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.301957 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.301987 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.302056 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.302134 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc 
kubenswrapper[4926]: I0312 18:25:12.404288 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.404344 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.404409 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.404490 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.404557 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zmx\" (UniqueName: \"kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.405410 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.411701 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.412006 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.417913 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.425941 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zmx\" (UniqueName: \"kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx\") pod \"nova-metadata-0\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") " 
pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.505544 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1344f618-3fb1-4718-ac28-aaac70801a43" path="/var/lib/kubelet/pods/1344f618-3fb1-4718-ac28-aaac70801a43/volumes" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.507002 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ab53a0-e811-4b31-8a1e-d71be4115b2c" path="/var/lib/kubelet/pods/75ab53a0-e811-4b31-8a1e-d71be4115b2c/volumes" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.586638 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.660559 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 18:25:12 crc kubenswrapper[4926]: W0312 18:25:12.662220 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9696ed5e_263b_457b_a771_9f7a7e27dbed.slice/crio-ac5d920764343c3822aea3eec4f980699931e85fcb37936f0c6999814073f8f3 WatchSource:0}: Error finding container ac5d920764343c3822aea3eec4f980699931e85fcb37936f0c6999814073f8f3: Status 404 returned error can't find the container with id ac5d920764343c3822aea3eec4f980699931e85fcb37936f0c6999814073f8f3 Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.801231 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" containerName="nova-scheduler-scheduler" containerID="cri-o://ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" gracePeriod=30 Mar 12 18:25:12 crc kubenswrapper[4926]: I0312 18:25:12.801563 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9696ed5e-263b-457b-a771-9f7a7e27dbed","Type":"ContainerStarted","Data":"ac5d920764343c3822aea3eec4f980699931e85fcb37936f0c6999814073f8f3"} Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.081907 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.812412 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9696ed5e-263b-457b-a771-9f7a7e27dbed","Type":"ContainerStarted","Data":"bb9e08c6749891ff5759e1975e0508d47c8fe6e0741407ca15c1c009eb4de943"} Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.813771 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.819132 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerStarted","Data":"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"} Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.819268 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerStarted","Data":"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"} Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.819353 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerStarted","Data":"0161e0bff07323bc954df2ec0573f151d9db197c8f01a596b3666178a9a120fe"} Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.831859 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8318353480000003 podStartE2EDuration="2.831835348s" podCreationTimestamp="2026-03-12 18:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:13.827416899 +0000 UTC m=+1354.196043232" watchObservedRunningTime="2026-03-12 18:25:13.831835348 +0000 UTC m=+1354.200461681" Mar 12 18:25:13 crc kubenswrapper[4926]: I0312 18:25:13.854359 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8543362540000001 podStartE2EDuration="1.854336254s" podCreationTimestamp="2026-03-12 18:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:13.846390544 +0000 UTC m=+1354.215016897" watchObservedRunningTime="2026-03-12 18:25:13.854336254 +0000 UTC m=+1354.222962587" Mar 12 18:25:14 crc kubenswrapper[4926]: E0312 18:25:14.980380 4926 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 is running failed: container process not found" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 18:25:14 crc kubenswrapper[4926]: E0312 18:25:14.981558 4926 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 is running failed: container process not found" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 18:25:14 crc kubenswrapper[4926]: E0312 18:25:14.981915 4926 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 is running failed: container process not found" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 18:25:14 crc kubenswrapper[4926]: E0312 18:25:14.982053 4926 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" containerName="nova-scheduler-scheduler" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.078155 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.096914 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.429637 4926 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.475279 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle\") pod \"7c00983f-ac91-410c-9ee3-d55342198d70\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.475384 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data\") pod \"7c00983f-ac91-410c-9ee3-d55342198d70\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.475597 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss276\" (UniqueName: \"kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276\") pod \"7c00983f-ac91-410c-9ee3-d55342198d70\" (UID: \"7c00983f-ac91-410c-9ee3-d55342198d70\") " Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.484757 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276" (OuterVolumeSpecName: "kube-api-access-ss276") pod "7c00983f-ac91-410c-9ee3-d55342198d70" (UID: "7c00983f-ac91-410c-9ee3-d55342198d70"). InnerVolumeSpecName "kube-api-access-ss276". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.523182 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data" (OuterVolumeSpecName: "config-data") pod "7c00983f-ac91-410c-9ee3-d55342198d70" (UID: "7c00983f-ac91-410c-9ee3-d55342198d70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.527101 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c00983f-ac91-410c-9ee3-d55342198d70" (UID: "7c00983f-ac91-410c-9ee3-d55342198d70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.578655 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss276\" (UniqueName: \"kubernetes.io/projected/7c00983f-ac91-410c-9ee3-d55342198d70-kube-api-access-ss276\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.578686 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.578700 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c00983f-ac91-410c-9ee3-d55342198d70-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.841746 4926 generic.go:334] "Generic (PLEG): container finished" podID="7c00983f-ac91-410c-9ee3-d55342198d70" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" exitCode=0 Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.841807 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.841820 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c00983f-ac91-410c-9ee3-d55342198d70","Type":"ContainerDied","Data":"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323"} Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.841881 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c00983f-ac91-410c-9ee3-d55342198d70","Type":"ContainerDied","Data":"8a2d87ff1b1f13bbcd71a5ed9af411076eaddd0c93ccca3397ca3bb1a0eb3af8"} Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.841903 4926 scope.go:117] "RemoveContainer" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.884115 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.884265 4926 scope.go:117] "RemoveContainer" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.885349 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 18:25:15 crc kubenswrapper[4926]: E0312 18:25:15.893750 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323\": container with ID starting with ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 not found: ID does not exist" containerID="ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.893793 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323"} err="failed to get container status \"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323\": rpc error: code = NotFound desc = could not find container \"ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323\": container with ID starting with 
ea5ca88836de9cfdc6fc5b152d149a335a9b476f1550663f5769f0a8fc968323 not found: ID does not exist" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.914142 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.926394 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:15 crc kubenswrapper[4926]: E0312 18:25:15.926849 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" containerName="nova-scheduler-scheduler" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.926869 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" containerName="nova-scheduler-scheduler" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.927121 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" containerName="nova-scheduler-scheduler" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.927822 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.929945 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.941606 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.987089 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xl7g\" (UniqueName: \"kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.987243 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:15 crc kubenswrapper[4926]: I0312 18:25:15.987304 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.089786 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.089856 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.089928 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4xl7g\" (UniqueName: \"kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.093828 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.094081 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.104327 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xl7g\" (UniqueName: \"kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g\") pod \"nova-scheduler-0\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.253297 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.502087 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c00983f-ac91-410c-9ee3-d55342198d70" path="/var/lib/kubelet/pods/7c00983f-ac91-410c-9ee3-d55342198d70/volumes" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.753180 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:16 crc kubenswrapper[4926]: W0312 18:25:16.764713 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e2c8ff_6e0f_4a2b_a6d5_e2c4cf9c97a3.slice/crio-7a7d155b546588f16ab2cdd1af883265350417efed1c7aef091311c1c7d02134 WatchSource:0}: Error finding container 7a7d155b546588f16ab2cdd1af883265350417efed1c7aef091311c1c7d02134: Status 404 returned error can't find the container with id 7a7d155b546588f16ab2cdd1af883265350417efed1c7aef091311c1c7d02134 Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.854485 4926 generic.go:334] "Generic (PLEG): container finished" podID="522020cf-1556-4192-92c8-6cab42123da0" containerID="d312e71f700c8194066a4bfb0efbe92c4e2b9fbced6805317853fff15c35d5d6" exitCode=0 Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.854602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerDied","Data":"d312e71f700c8194066a4bfb0efbe92c4e2b9fbced6805317853fff15c35d5d6"} Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.854631 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"522020cf-1556-4192-92c8-6cab42123da0","Type":"ContainerDied","Data":"42ea86479f09e663945c5ef83197d6d342bd9e261315d58ca9f838ea2ad55e1c"} Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.854644 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ea86479f09e663945c5ef83197d6d342bd9e261315d58ca9f838ea2ad55e1c" Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 
18:25:16.856360 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3","Type":"ContainerStarted","Data":"7a7d155b546588f16ab2cdd1af883265350417efed1c7aef091311c1c7d02134"} Mar 12 18:25:16 crc kubenswrapper[4926]: I0312 18:25:16.908851 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.007718 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjrtd\" (UniqueName: \"kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd\") pod \"522020cf-1556-4192-92c8-6cab42123da0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.007844 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle\") pod \"522020cf-1556-4192-92c8-6cab42123da0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.007906 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data\") pod \"522020cf-1556-4192-92c8-6cab42123da0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.007959 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs\") pod \"522020cf-1556-4192-92c8-6cab42123da0\" (UID: \"522020cf-1556-4192-92c8-6cab42123da0\") " Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.009753 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs" (OuterVolumeSpecName: "logs") pod "522020cf-1556-4192-92c8-6cab42123da0" (UID: "522020cf-1556-4192-92c8-6cab42123da0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.011407 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd" (OuterVolumeSpecName: "kube-api-access-hjrtd") pod "522020cf-1556-4192-92c8-6cab42123da0" (UID: "522020cf-1556-4192-92c8-6cab42123da0"). InnerVolumeSpecName "kube-api-access-hjrtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.032984 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "522020cf-1556-4192-92c8-6cab42123da0" (UID: "522020cf-1556-4192-92c8-6cab42123da0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.052200 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data" (OuterVolumeSpecName: "config-data") pod "522020cf-1556-4192-92c8-6cab42123da0" (UID: "522020cf-1556-4192-92c8-6cab42123da0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.110630 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.110668 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/522020cf-1556-4192-92c8-6cab42123da0-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.110680 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/522020cf-1556-4192-92c8-6cab42123da0-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.110692 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjrtd\" (UniqueName: \"kubernetes.io/projected/522020cf-1556-4192-92c8-6cab42123da0-kube-api-access-hjrtd\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.208985 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.587706 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.587780 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.725134 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.796987 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vwzff"] Mar 12 18:25:17 crc kubenswrapper[4926]: E0312 18:25:17.797587 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-api" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.797615 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-api" Mar 12 18:25:17 crc kubenswrapper[4926]: E0312 18:25:17.797638 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-log" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.797649 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-log" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.797969 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-log" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.798013 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="522020cf-1556-4192-92c8-6cab42123da0" containerName="nova-api-api" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.798763 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.801869 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.802416 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.810936 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vwzff"] Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.868582 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.872525 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3","Type":"ContainerStarted","Data":"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177"} Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.898800 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8987822640000003 podStartE2EDuration="2.898782264s" podCreationTimestamp="2026-03-12 18:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:17.889614736 +0000 UTC m=+1358.258241079" watchObservedRunningTime="2026-03-12 18:25:17.898782264 +0000 UTC m=+1358.267408597" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.937724 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrrql\" (UniqueName: \"kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.937792 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.938027 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.938127 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.939635 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.963564 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 
18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.980082 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.983039 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.986279 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 18:25:17 crc kubenswrapper[4926]: I0312 18:25:17.991269 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040576 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5mxb\" (UniqueName: \"kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040626 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrrql\" (UniqueName: \"kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040674 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040703 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040719 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040734 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040779 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.040800 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vwzff\" 
(UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.046904 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.048009 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.049756 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.058107 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrrql\" (UniqueName: \"kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql\") pod \"nova-cell1-cell-mapping-vwzff\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.125509 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.142972 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.143326 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.143342 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.143456 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.143490 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5mxb\" (UniqueName: \"kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc 
kubenswrapper[4926]: I0312 18:25:18.148936 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.149840 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.161626 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5mxb\" (UniqueName: \"kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb\") pod \"nova-api-0\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.309153 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.534105 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522020cf-1556-4192-92c8-6cab42123da0" path="/var/lib/kubelet/pods/522020cf-1556-4192-92c8-6cab42123da0/volumes" Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.673898 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vwzff"] Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.868994 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:18 crc kubenswrapper[4926]: W0312 18:25:18.875385 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff10ffb2_f421_4a3a_aafd_5b081df5d109.slice/crio-daa31bd5c5f69662abb20c63efc93566505de27536b89db47ab73ffecd04929a WatchSource:0}: Error finding container daa31bd5c5f69662abb20c63efc93566505de27536b89db47ab73ffecd04929a: Status 404 returned error can't find the container with id daa31bd5c5f69662abb20c63efc93566505de27536b89db47ab73ffecd04929a Mar 12 18:25:18 crc kubenswrapper[4926]: I0312 18:25:18.877559 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vwzff" event={"ID":"76a36d0b-fd08-4c93-846c-688c71055113","Type":"ContainerStarted","Data":"a7717c4f2567107bbbfa3a1e3937a74f1ee5564e6be712ecb32bfbf5d5d798bb"} Mar 12 18:25:19 crc kubenswrapper[4926]: I0312 18:25:19.889719 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vwzff" event={"ID":"76a36d0b-fd08-4c93-846c-688c71055113","Type":"ContainerStarted","Data":"ddcfcec936ed5e6887603a245e014b4f9336991c54bce882c0b7657f1d5a318b"} Mar 12 18:25:19 crc kubenswrapper[4926]: I0312 18:25:19.892782 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerStarted","Data":"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8"} Mar 12 18:25:19 crc kubenswrapper[4926]: I0312 18:25:19.892859 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerStarted","Data":"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728"} Mar 12 18:25:19 crc 
kubenswrapper[4926]: I0312 18:25:19.892878 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerStarted","Data":"daa31bd5c5f69662abb20c63efc93566505de27536b89db47ab73ffecd04929a"} Mar 12 18:25:19 crc kubenswrapper[4926]: I0312 18:25:19.919485 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vwzff" podStartSLOduration=2.9194579000000003 podStartE2EDuration="2.9194579s" podCreationTimestamp="2026-03-12 18:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:19.91528595 +0000 UTC m=+1360.283912303" watchObservedRunningTime="2026-03-12 18:25:19.9194579 +0000 UTC m=+1360.288084243" Mar 12 18:25:19 crc kubenswrapper[4926]: I0312 18:25:19.942423 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.94240352 podStartE2EDuration="2.94240352s" podCreationTimestamp="2026-03-12 18:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:19.936678451 +0000 UTC m=+1360.305304784" watchObservedRunningTime="2026-03-12 18:25:19.94240352 +0000 UTC m=+1360.311029853" Mar 12 18:25:21 crc kubenswrapper[4926]: I0312 18:25:21.253948 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.552047 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.552492 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerName="kube-state-metrics" containerID="cri-o://c936fa8962854c1d8665ba026930a8b325915da3444b99cba8eeb7debd2bb042" gracePeriod=30 Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.588037 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.588128 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.702023 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": dial tcp 10.217.0.106:8081: connect: connection refused" Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.927105 4926 generic.go:334] "Generic (PLEG): container finished" podID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerID="c936fa8962854c1d8665ba026930a8b325915da3444b99cba8eeb7debd2bb042" exitCode=2 Mar 12 18:25:22 crc kubenswrapper[4926]: I0312 18:25:22.927425 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3343d19e-07d3-4de8-954a-f7e31aa8279f","Type":"ContainerDied","Data":"c936fa8962854c1d8665ba026930a8b325915da3444b99cba8eeb7debd2bb042"} Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.047080 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.143835 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bjbq\" (UniqueName: \"kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq\") pod \"3343d19e-07d3-4de8-954a-f7e31aa8279f\" (UID: \"3343d19e-07d3-4de8-954a-f7e31aa8279f\") " Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.164751 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq" (OuterVolumeSpecName: "kube-api-access-6bjbq") pod "3343d19e-07d3-4de8-954a-f7e31aa8279f" (UID: "3343d19e-07d3-4de8-954a-f7e31aa8279f"). InnerVolumeSpecName "kube-api-access-6bjbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.245716 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bjbq\" (UniqueName: \"kubernetes.io/projected/3343d19e-07d3-4de8-954a-f7e31aa8279f-kube-api-access-6bjbq\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.601693 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.601709 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.952140 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.952155 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3343d19e-07d3-4de8-954a-f7e31aa8279f","Type":"ContainerDied","Data":"a5a3fb3c5e3c756132a6d36939e2dc50de833c191b34382d6e423a84d04ecf45"} Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.952505 4926 scope.go:117] "RemoveContainer" containerID="c936fa8962854c1d8665ba026930a8b325915da3444b99cba8eeb7debd2bb042" Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.956781 4926 generic.go:334] "Generic (PLEG): container finished" podID="76a36d0b-fd08-4c93-846c-688c71055113" containerID="ddcfcec936ed5e6887603a245e014b4f9336991c54bce882c0b7657f1d5a318b" exitCode=0 Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.956834 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vwzff" event={"ID":"76a36d0b-fd08-4c93-846c-688c71055113","Type":"ContainerDied","Data":"ddcfcec936ed5e6887603a245e014b4f9336991c54bce882c0b7657f1d5a318b"} Mar 12 18:25:23 crc kubenswrapper[4926]: I0312 18:25:23.998647 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.013055 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.025937 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: E0312 18:25:24.026422 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerName="kube-state-metrics" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.026462 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerName="kube-state-metrics" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.026804 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" containerName="kube-state-metrics" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.027755 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.033375 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.033858 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.035509 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.059713 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.060001 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcclx\" (UniqueName: \"kubernetes.io/projected/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-api-access-dcclx\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.060163 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.060351 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.162124 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.162747 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcclx\" (UniqueName: \"kubernetes.io/projected/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-api-access-dcclx\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.162925 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.163101 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.167859 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.168588 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.170267 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.197240 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcclx\" (UniqueName: \"kubernetes.io/projected/30b7fb7c-fcab-4551-8284-c0dab53beb21-kube-api-access-dcclx\") pod \"kube-state-metrics-0\" (UID: \"30b7fb7c-fcab-4551-8284-c0dab53beb21\") " pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.375774 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.471921 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.472328 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="proxy-httpd" containerID="cri-o://271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0" gracePeriod=30 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.472351 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="sg-core" containerID="cri-o://26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b" gracePeriod=30 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.472381 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-central-agent" containerID="cri-o://5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d" gracePeriod=30 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.472353 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-notification-agent" containerID="cri-o://5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d" gracePeriod=30 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.512372 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3343d19e-07d3-4de8-954a-f7e31aa8279f" path="/var/lib/kubelet/pods/3343d19e-07d3-4de8-954a-f7e31aa8279f/volumes" Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.896977 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 18:25:24 crc kubenswrapper[4926]: W0312 18:25:24.912893 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30b7fb7c_fcab_4551_8284_c0dab53beb21.slice/crio-64ca1401b7b6b47ce2ad73b6c528c5b95d0557b055354e78739d6cfd6228b3af WatchSource:0}: Error finding container 64ca1401b7b6b47ce2ad73b6c528c5b95d0557b055354e78739d6cfd6228b3af: Status 404 returned error can't find the container with id 64ca1401b7b6b47ce2ad73b6c528c5b95d0557b055354e78739d6cfd6228b3af Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.966376 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30b7fb7c-fcab-4551-8284-c0dab53beb21","Type":"ContainerStarted","Data":"64ca1401b7b6b47ce2ad73b6c528c5b95d0557b055354e78739d6cfd6228b3af"} Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970779 4926 generic.go:334] "Generic (PLEG): container finished" podID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerID="271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0" exitCode=0 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970803 4926 generic.go:334] "Generic (PLEG): container finished" podID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerID="26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b" exitCode=2 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970811 4926 generic.go:334] "Generic (PLEG): container finished" podID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" 
containerID="5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d" exitCode=0 Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970829 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerDied","Data":"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0"} Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970878 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerDied","Data":"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b"} Mar 12 18:25:24 crc kubenswrapper[4926]: I0312 18:25:24.970891 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerDied","Data":"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d"} Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.475596 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.592466 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data\") pod \"76a36d0b-fd08-4c93-846c-688c71055113\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.592574 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts\") pod \"76a36d0b-fd08-4c93-846c-688c71055113\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.592629 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle\") pod \"76a36d0b-fd08-4c93-846c-688c71055113\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.592699 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrrql\" (UniqueName: \"kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql\") pod \"76a36d0b-fd08-4c93-846c-688c71055113\" (UID: \"76a36d0b-fd08-4c93-846c-688c71055113\") " Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.596598 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts" (OuterVolumeSpecName: "scripts") pod "76a36d0b-fd08-4c93-846c-688c71055113" (UID: "76a36d0b-fd08-4c93-846c-688c71055113"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.597272 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql" (OuterVolumeSpecName: "kube-api-access-nrrql") pod "76a36d0b-fd08-4c93-846c-688c71055113" (UID: "76a36d0b-fd08-4c93-846c-688c71055113"). InnerVolumeSpecName "kube-api-access-nrrql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.623692 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data" (OuterVolumeSpecName: "config-data") pod "76a36d0b-fd08-4c93-846c-688c71055113" (UID: "76a36d0b-fd08-4c93-846c-688c71055113"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.626713 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76a36d0b-fd08-4c93-846c-688c71055113" (UID: "76a36d0b-fd08-4c93-846c-688c71055113"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.694848 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.694881 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.694890 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a36d0b-fd08-4c93-846c-688c71055113-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.694901 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrrql\" (UniqueName: \"kubernetes.io/projected/76a36d0b-fd08-4c93-846c-688c71055113-kube-api-access-nrrql\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.982270 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vwzff" event={"ID":"76a36d0b-fd08-4c93-846c-688c71055113","Type":"ContainerDied","Data":"a7717c4f2567107bbbfa3a1e3937a74f1ee5564e6be712ecb32bfbf5d5d798bb"} Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.982311 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7717c4f2567107bbbfa3a1e3937a74f1ee5564e6be712ecb32bfbf5d5d798bb" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.982520 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vwzff" Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.984353 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30b7fb7c-fcab-4551-8284-c0dab53beb21","Type":"ContainerStarted","Data":"5a30176409a7d48bc579f1ad02423c28ddc46894468c7dbc8bf6f21ccb4c8522"} Mar 12 18:25:25 crc kubenswrapper[4926]: I0312 18:25:25.984490 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.021363 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.614543336 podStartE2EDuration="3.021343721s" podCreationTimestamp="2026-03-12 18:25:23 +0000 UTC" firstStartedPulling="2026-03-12 18:25:24.916228443 +0000 UTC m=+1365.284854776" lastFinishedPulling="2026-03-12 18:25:25.323028828 +0000 UTC m=+1365.691655161" observedRunningTime="2026-03-12 18:25:26.005275056 +0000 UTC m=+1366.373901389" watchObservedRunningTime="2026-03-12 18:25:26.021343721 +0000 UTC m=+1366.389970064" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.166585 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.167131 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-log" containerID="cri-o://29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" gracePeriod=30 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.167167 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-api" containerID="cri-o://d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" gracePeriod=30 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.190566 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.190835 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" containerName="nova-scheduler-scheduler" containerID="cri-o://1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177" gracePeriod=30 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.241034 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.248157 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-log" containerID="cri-o://b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28" gracePeriod=30 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.248333 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-metadata" containerID="cri-o://fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6" gracePeriod=30 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.714604 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.830379 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle\") pod \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.830508 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5mxb\" (UniqueName: \"kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb\") pod \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.830675 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data\") pod \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.830709 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs\") pod \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\" (UID: \"ff10ffb2-f421-4a3a-aafd-5b081df5d109\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.833607 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs" (OuterVolumeSpecName: "logs") pod "ff10ffb2-f421-4a3a-aafd-5b081df5d109" (UID: "ff10ffb2-f421-4a3a-aafd-5b081df5d109"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.839653 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb" (OuterVolumeSpecName: "kube-api-access-r5mxb") pod "ff10ffb2-f421-4a3a-aafd-5b081df5d109" (UID: "ff10ffb2-f421-4a3a-aafd-5b081df5d109"). InnerVolumeSpecName "kube-api-access-r5mxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.845781 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.859833 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff10ffb2-f421-4a3a-aafd-5b081df5d109" (UID: "ff10ffb2-f421-4a3a-aafd-5b081df5d109"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.861012 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data" (OuterVolumeSpecName: "config-data") pod "ff10ffb2-f421-4a3a-aafd-5b081df5d109" (UID: "ff10ffb2-f421-4a3a-aafd-5b081df5d109"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932130 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932239 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932313 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932424 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbct5\" (UniqueName: \"kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932486 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932522 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932562 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd\") pod \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\" (UID: \"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d\") " Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.932819 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933053 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933737 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933760 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff10ffb2-f421-4a3a-aafd-5b081df5d109-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933769 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff10ffb2-f421-4a3a-aafd-5b081df5d109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933779 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933792 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5mxb\" (UniqueName: \"kubernetes.io/projected/ff10ffb2-f421-4a3a-aafd-5b081df5d109-kube-api-access-r5mxb\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.933802 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.937043 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts" (OuterVolumeSpecName: "scripts") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.937114 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5" (OuterVolumeSpecName: "kube-api-access-dbct5") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "kube-api-access-dbct5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.980144 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.994295 4926 generic.go:334] "Generic (PLEG): container finished" podID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerID="5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d" exitCode=0 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.994350 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerDied","Data":"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d"} Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.994374 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d80b1d3-3647-44fe-9a41-f1a6c11aff1d","Type":"ContainerDied","Data":"ab0418481e92c6856771c7ea5c23c736e2ec58cc43ca849b59c1411888deb2c2"} Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.994391 4926 scope.go:117] "RemoveContainer" containerID="271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.994571 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999359 4926 generic.go:334] "Generic (PLEG): container finished" podID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerID="d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" exitCode=0 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999396 4926 generic.go:334] "Generic (PLEG): container finished" podID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerID="29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" exitCode=143 Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999447 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999473 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerDied","Data":"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8"} Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999500 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerDied","Data":"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728"} Mar 12 18:25:26 crc kubenswrapper[4926]: I0312 18:25:26.999509 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff10ffb2-f421-4a3a-aafd-5b081df5d109","Type":"ContainerDied","Data":"daa31bd5c5f69662abb20c63efc93566505de27536b89db47ab73ffecd04929a"} Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.001388 4926 generic.go:334] "Generic (PLEG): container finished" podID="f6b42e45-6f11-4207-b26d-7befa423860f" containerID="b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28" exitCode=143 Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.001470 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerDied","Data":"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"} Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.036174 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbct5\" (UniqueName: \"kubernetes.io/projected/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-kube-api-access-dbct5\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.036205 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.036214 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.040504 4926 scope.go:117] "RemoveContainer" containerID="26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.042157 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.054501 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.055714 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.062867 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data" (OuterVolumeSpecName: "config-data") pod "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" (UID: "6d80b1d3-3647-44fe-9a41-f1a6c11aff1d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.070569 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071076 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-log" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071098 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-log" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071119 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-central-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071127 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-central-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071142 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-api" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071149 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-api" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071164 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="proxy-httpd" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071171 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="proxy-httpd" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071188 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-notification-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071295 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-notification-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071318 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="sg-core" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071325 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="sg-core" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.071338 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a36d0b-fd08-4c93-846c-688c71055113" containerName="nova-manage" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.071345 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a36d0b-fd08-4c93-846c-688c71055113" containerName="nova-manage" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072271 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a36d0b-fd08-4c93-846c-688c71055113" containerName="nova-manage" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072294 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="proxy-httpd" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072314 4926 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-central-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072325 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="ceilometer-notification-agent" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072337 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-api" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072351 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" containerName="nova-api-log" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.072362 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" containerName="sg-core" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.074655 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.078005 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.080009 4926 scope.go:117] "RemoveContainer" containerID="5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.101899 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.112501 4926 scope.go:117] "RemoveContainer" containerID="5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137548 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137594 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137619 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137690 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrb4\" (UniqueName: \"kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137733 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137744 4926 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.137731 4926 scope.go:117] "RemoveContainer" containerID="271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.138212 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0\": container with ID starting with 271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0 not found: ID does not exist" containerID="271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138273 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0"} err="failed to get container status \"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0\": rpc error: code = NotFound desc = could not find container \"271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0\": container with ID starting with 271aa5ed045ae8594c9cbb880d3c37153215115447cba035c80a461aca09cea0 not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138292 4926 scope.go:117] "RemoveContainer" containerID="26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.138575 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b\": container with ID starting with 26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b not found: ID does not exist" containerID="26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138608 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b"} err="failed to get container status \"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b\": rpc error: code = NotFound desc = could not find container \"26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b\": container with ID starting with 26aaa0ccefcdb9eb3bdf054f4109a5704267b244c20325cae1cd1d7a964dc43b not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138629 4926 scope.go:117] "RemoveContainer" containerID="5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.138833 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d\": container with ID starting with 5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d not found: ID does not exist" containerID="5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138862 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d"} err="failed to 
get container status \"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d\": rpc error: code = NotFound desc = could not find container \"5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d\": container with ID starting with 5133e83018172bddb2651e87d19c3c584b880343960493ab0fec8d492083c66d not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.138878 4926 scope.go:117] "RemoveContainer" containerID="5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.139065 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d\": container with ID starting with 5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d not found: ID does not exist" containerID="5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.139088 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d"} err="failed to get container status \"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d\": rpc error: code = NotFound desc = could not find container \"5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d\": container with ID starting with 5a7b4815509ad4aad6640591fd715ed5b59dfabe9c09852413d0e4c6fe85885d not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.139151 4926 scope.go:117] "RemoveContainer" containerID="d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.155520 4926 scope.go:117] "RemoveContainer" containerID="29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.185559 4926 scope.go:117] "RemoveContainer" containerID="d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.185996 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8\": container with ID starting with d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8 not found: ID does not exist" containerID="d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186019 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8"} err="failed to get container status \"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8\": rpc error: code = NotFound desc = could not find container \"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8\": container with ID starting with d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8 not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186038 4926 scope.go:117] "RemoveContainer" containerID="29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" Mar 12 18:25:27 crc kubenswrapper[4926]: E0312 18:25:27.186370 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728\": container with ID starting with 29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728 not found: ID does not exist" containerID="29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186385 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728"} err="failed to get container status \"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728\": rpc error: code = NotFound desc = could not find container \"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728\": container with ID starting with 29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728 not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186399 4926 scope.go:117] "RemoveContainer" containerID="d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186612 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8"} err="failed to get container status \"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8\": rpc error: code = NotFound desc = could not find container \"d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8\": container with ID starting with d65cf4c963918ec2d4ea410d0b00104c86b16edaf457c2b52b271c52720055c8 not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186649 4926 scope.go:117] "RemoveContainer" containerID="29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.186961 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728"} err="failed to get container status \"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728\": rpc error: code = NotFound desc = could not find container \"29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728\": container with ID starting with 29b49f9130da1335dccc5243b155c7585a61ed1e763af8bc4996db2dc9990728 not found: ID does not exist" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.239701 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.239752 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.239780 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.240128 4926 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qsrb4\" (UniqueName: \"kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.240624 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.243813 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.244265 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.256650 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrb4\" (UniqueName: \"kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4\") pod \"nova-api-0\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.357780 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.368021 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.379083 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.381708 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.384296 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.385236 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.394168 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.406606 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.411386 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453178 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453237 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453330 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453355 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453382 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.453516 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.454824 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh6lh\" (UniqueName: \"kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.454884 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.557887 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " 
pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558627 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558663 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558686 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558741 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558795 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh6lh\" (UniqueName: \"kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558826 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.558866 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.559862 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.560289 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.564360 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 
crc kubenswrapper[4926]: I0312 18:25:27.565213 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.565692 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.566243 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.569006 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.578152 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh6lh\" (UniqueName: \"kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh\") pod \"ceilometer-0\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") " pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.706507 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:25:27 crc kubenswrapper[4926]: I0312 18:25:27.890136 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:27 crc kubenswrapper[4926]: W0312 18:25:27.896284 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd37e2fa1_9044_4dfb_9aea_bcb441cc91a4.slice/crio-e038299ff9160ecc8a030d680499ae9a217c62acd0df47033360941b4347d3a1 WatchSource:0}: Error finding container e038299ff9160ecc8a030d680499ae9a217c62acd0df47033360941b4347d3a1: Status 404 returned error can't find the container with id e038299ff9160ecc8a030d680499ae9a217c62acd0df47033360941b4347d3a1 Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.020260 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerStarted","Data":"e038299ff9160ecc8a030d680499ae9a217c62acd0df47033360941b4347d3a1"} Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.164579 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:28 crc kubenswrapper[4926]: W0312 18:25:28.165310 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccd8b5fe_8294_40c2_a6f4_3b82012686ec.slice/crio-da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386 WatchSource:0}: Error finding container da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386: Status 404 returned error can't find the container with id da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386 Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.504369 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d80b1d3-3647-44fe-9a41-f1a6c11aff1d" path="/var/lib/kubelet/pods/6d80b1d3-3647-44fe-9a41-f1a6c11aff1d/volumes" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.505799 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff10ffb2-f421-4a3a-aafd-5b081df5d109" path="/var/lib/kubelet/pods/ff10ffb2-f421-4a3a-aafd-5b081df5d109/volumes" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.765569 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.885759 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xl7g\" (UniqueName: \"kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g\") pod \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.885874 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle\") pod \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.886251 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data\") pod \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\" (UID: \"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3\") " Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.890342 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g" (OuterVolumeSpecName: "kube-api-access-4xl7g") pod "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" (UID: "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3"). InnerVolumeSpecName "kube-api-access-4xl7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.917882 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data" (OuterVolumeSpecName: "config-data") pod "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" (UID: "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.920800 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" (UID: "e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.988752 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.988802 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:28 crc kubenswrapper[4926]: I0312 18:25:28.988818 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xl7g\" (UniqueName: \"kubernetes.io/projected/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3-kube-api-access-4xl7g\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.030362 4926 generic.go:334] "Generic (PLEG): container finished" podID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" containerID="1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177" exitCode=0 Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.030471 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3","Type":"ContainerDied","Data":"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177"} Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.030510 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3","Type":"ContainerDied","Data":"7a7d155b546588f16ab2cdd1af883265350417efed1c7aef091311c1c7d02134"} Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.030537 4926 scope.go:117] "RemoveContainer" containerID="1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177" Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.030709 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.035992 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerStarted","Data":"00d08df89cba0281fd578a80d34eeb606178fe179a18e0bb4dca41d34fcaae6c"}
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.036042 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerStarted","Data":"da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386"}
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.038170 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerStarted","Data":"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c"}
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.038202 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerStarted","Data":"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12"}
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.060667 4926 scope.go:117] "RemoveContainer" containerID="1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177"
Mar 12 18:25:29 crc kubenswrapper[4926]: E0312 18:25:29.061097 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177\": container with ID starting with 1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177 not found: ID does not exist" containerID="1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.061132 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177"} err="failed to get container status \"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177\": rpc error: code = NotFound desc = could not find container \"1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177\": container with ID starting with 1c8ef3a70608e2a8423d2b9d25faab813e85066f6ede6071e20748773e377177 not found: ID does not exist"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.067914 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.067893098 podStartE2EDuration="2.067893098s" podCreationTimestamp="2026-03-12 18:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:29.054454606 +0000 UTC m=+1369.423080969" watchObservedRunningTime="2026-03-12 18:25:29.067893098 +0000 UTC m=+1369.436519431"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.094213 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.130426 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.149158 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 18:25:29 crc kubenswrapper[4926]: E0312 18:25:29.149631 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" containerName="nova-scheduler-scheduler"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.149654 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" containerName="nova-scheduler-scheduler"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.149854 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" containerName="nova-scheduler-scheduler"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.150456 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.152265 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.160951 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.192963 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-config-data\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.193053 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.193129 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrfd\" (UniqueName: \"kubernetes.io/projected/6d9c2983-b118-4677-ba18-20531d4223ad-kube-api-access-ffrfd\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.294663 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrfd\" (UniqueName: \"kubernetes.io/projected/6d9c2983-b118-4677-ba18-20531d4223ad-kube-api-access-ffrfd\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.294752 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-config-data\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.294820 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.299172 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.304032 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c2983-b118-4677-ba18-20531d4223ad-config-data\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.312286 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrfd\" (UniqueName: \"kubernetes.io/projected/6d9c2983-b118-4677-ba18-20531d4223ad-kube-api-access-ffrfd\") pod \"nova-scheduler-0\" (UID: \"6d9c2983-b118-4677-ba18-20531d4223ad\") " pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.520270 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.819295 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.903714 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle\") pod \"f6b42e45-6f11-4207-b26d-7befa423860f\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") "
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.903793 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4zmx\" (UniqueName: \"kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx\") pod \"f6b42e45-6f11-4207-b26d-7befa423860f\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") "
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.903871 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data\") pod \"f6b42e45-6f11-4207-b26d-7befa423860f\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") "
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.903912 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs\") pod \"f6b42e45-6f11-4207-b26d-7befa423860f\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") "
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.903964 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs\") pod \"f6b42e45-6f11-4207-b26d-7befa423860f\" (UID: \"f6b42e45-6f11-4207-b26d-7befa423860f\") "
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.909301 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx" (OuterVolumeSpecName: "kube-api-access-b4zmx") pod "f6b42e45-6f11-4207-b26d-7befa423860f" (UID: "f6b42e45-6f11-4207-b26d-7befa423860f"). InnerVolumeSpecName "kube-api-access-b4zmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.909777 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs" (OuterVolumeSpecName: "logs") pod "f6b42e45-6f11-4207-b26d-7befa423860f" (UID: "f6b42e45-6f11-4207-b26d-7befa423860f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.930094 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b42e45-6f11-4207-b26d-7befa423860f" (UID: "f6b42e45-6f11-4207-b26d-7befa423860f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.930708 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data" (OuterVolumeSpecName: "config-data") pod "f6b42e45-6f11-4207-b26d-7befa423860f" (UID: "f6b42e45-6f11-4207-b26d-7befa423860f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:29 crc kubenswrapper[4926]: I0312 18:25:29.946848 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f6b42e45-6f11-4207-b26d-7befa423860f" (UID: "f6b42e45-6f11-4207-b26d-7befa423860f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.006422 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.006466 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4zmx\" (UniqueName: \"kubernetes.io/projected/f6b42e45-6f11-4207-b26d-7befa423860f-kube-api-access-b4zmx\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.006476 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.006485 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6b42e45-6f11-4207-b26d-7befa423860f-logs\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.006494 4926 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6b42e45-6f11-4207-b26d-7befa423860f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.037784 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.047695 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerStarted","Data":"22a2d1e8587226f797d924ae278b1c0e7b846e0cc05d9f335cbb31ac04af937c"}
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.052267 4926 generic.go:334] "Generic (PLEG): container finished" podID="f6b42e45-6f11-4207-b26d-7befa423860f" containerID="fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6" exitCode=0
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.052339 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerDied","Data":"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"}
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.052371 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6b42e45-6f11-4207-b26d-7befa423860f","Type":"ContainerDied","Data":"0161e0bff07323bc954df2ec0573f151d9db197c8f01a596b3666178a9a120fe"}
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.052394 4926 scope.go:117] "RemoveContainer" containerID="fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.052575 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.098632 4926 scope.go:117] "RemoveContainer" containerID="b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.112053 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.126283 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.138279 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 18:25:30 crc kubenswrapper[4926]: E0312 18:25:30.139185 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-log"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.139214 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-log"
Mar 12 18:25:30 crc kubenswrapper[4926]: E0312 18:25:30.139267 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-metadata"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.139277 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-metadata"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.139736 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-log"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.139774 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" containerName="nova-metadata-metadata"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.146970 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.150901 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.151126 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.180491 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.198322 4926 scope.go:117] "RemoveContainer" containerID="fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"
Mar 12 18:25:30 crc kubenswrapper[4926]: E0312 18:25:30.199222 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6\": container with ID starting with fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6 not found: ID does not exist" containerID="fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.199271 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6"} err="failed to get container status \"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6\": rpc error: code = NotFound desc = could not find container \"fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6\": container with ID starting with fb487b53c7739924b7925ad02cd2780113be299a6f5b88e0215775a40ccaf8b6 not found: ID does not exist"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.199297 4926 scope.go:117] "RemoveContainer" containerID="b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"
Mar 12 18:25:30 crc kubenswrapper[4926]: E0312 18:25:30.199617 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28\": container with ID starting with b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28 not found: ID does not exist" containerID="b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.199643 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28"} err="failed to get container status \"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28\": rpc error: code = NotFound desc = could not find container \"b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28\": container with ID starting with b0f9e215a663c32f91df3902f5d9ab111e489e136457cd89dc0cefc519b52f28 not found: ID does not exist"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.209392 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8065d406-f127-4c57-b603-e7e6afeb3731-logs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.209571 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.209761 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.209822 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbck\" (UniqueName: \"kubernetes.io/projected/8065d406-f127-4c57-b603-e7e6afeb3731-kube-api-access-rnbck\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.209853 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-config-data\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.310926 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.311040 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.311081 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbck\" (UniqueName: \"kubernetes.io/projected/8065d406-f127-4c57-b603-e7e6afeb3731-kube-api-access-rnbck\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.311108 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-config-data\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.311165 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8065d406-f127-4c57-b603-e7e6afeb3731-logs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.311571 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8065d406-f127-4c57-b603-e7e6afeb3731-logs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.319632 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.319939 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.320143 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8065d406-f127-4c57-b603-e7e6afeb3731-config-data\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.338459 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbck\" (UniqueName: \"kubernetes.io/projected/8065d406-f127-4c57-b603-e7e6afeb3731-kube-api-access-rnbck\") pod \"nova-metadata-0\" (UID: \"8065d406-f127-4c57-b603-e7e6afeb3731\") " pod="openstack/nova-metadata-0"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.507191 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3" path="/var/lib/kubelet/pods/e6e2c8ff-6e0f-4a2b-a6d5-e2c4cf9c97a3/volumes"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.507882 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b42e45-6f11-4207-b26d-7befa423860f" path="/var/lib/kubelet/pods/f6b42e45-6f11-4207-b26d-7befa423860f/volumes"
Mar 12 18:25:30 crc kubenswrapper[4926]: I0312 18:25:30.587197 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 18:25:31 crc kubenswrapper[4926]: I0312 18:25:31.071148 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d9c2983-b118-4677-ba18-20531d4223ad","Type":"ContainerStarted","Data":"baa7c017e61a4be9bf8c8ad4d676a5457b6ae3959676b8e4e40e02afb89888e2"}
Mar 12 18:25:31 crc kubenswrapper[4926]: I0312 18:25:31.071758 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d9c2983-b118-4677-ba18-20531d4223ad","Type":"ContainerStarted","Data":"4be6a5c77109442ca23bc3eb5c55dcc298e0a416b77d98df56f9405885ecfee6"}
Mar 12 18:25:31 crc kubenswrapper[4926]: I0312 18:25:31.075482 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerStarted","Data":"e6bde295c7812497b3848b236f4e958ec3015f8ebeca1a4fdbd6ee1413805cb9"}
Mar 12 18:25:31 crc kubenswrapper[4926]: I0312 18:25:31.117176 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.117158521 podStartE2EDuration="2.117158521s" podCreationTimestamp="2026-03-12 18:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:31.088993947 +0000 UTC m=+1371.457620290" watchObservedRunningTime="2026-03-12 18:25:31.117158521 +0000 UTC m=+1371.485784854"
Mar 12 18:25:31 crc kubenswrapper[4926]: W0312 18:25:31.126581 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8065d406_f127_4c57_b603_e7e6afeb3731.slice/crio-eb3eb7fbcaf5e170aebb941bb0070f18800fda026662721aebf121b775e6664c WatchSource:0}: Error finding container eb3eb7fbcaf5e170aebb941bb0070f18800fda026662721aebf121b775e6664c: Status 404 returned error can't find the container with id eb3eb7fbcaf5e170aebb941bb0070f18800fda026662721aebf121b775e6664c
Mar 12 18:25:31 crc kubenswrapper[4926]: I0312 18:25:31.133261 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 18:25:32 crc kubenswrapper[4926]: I0312 18:25:32.087887 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8065d406-f127-4c57-b603-e7e6afeb3731","Type":"ContainerStarted","Data":"1e13c7538c29aaae5324b5fbacc1cda6402f336dc0c05bf17ccebd26e5306b6d"}
Mar 12 18:25:32 crc kubenswrapper[4926]: I0312 18:25:32.088252 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8065d406-f127-4c57-b603-e7e6afeb3731","Type":"ContainerStarted","Data":"7ef67231a2e8d527454e768f65df4f1e0eaa49d1bdd3fbdd9a31c26ce2bc8392"}
Mar 12 18:25:32 crc kubenswrapper[4926]: I0312 18:25:32.088272 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8065d406-f127-4c57-b603-e7e6afeb3731","Type":"ContainerStarted","Data":"eb3eb7fbcaf5e170aebb941bb0070f18800fda026662721aebf121b775e6664c"}
Mar 12 18:25:32 crc kubenswrapper[4926]: I0312 18:25:32.110589 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.110573024 podStartE2EDuration="2.110573024s" podCreationTimestamp="2026-03-12 18:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:32.10725809 +0000 UTC m=+1372.475884443" watchObservedRunningTime="2026-03-12 18:25:32.110573024 +0000 UTC m=+1372.479199357"
Mar 12 18:25:34 crc kubenswrapper[4926]: I0312 18:25:34.107110 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerStarted","Data":"2d6e21accb5b365373d4e446af9a0d2f3bb681b5f74d1a1d1d5bbeb21b98d345"}
Mar 12 18:25:34 crc kubenswrapper[4926]: I0312 18:25:34.107959 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 12 18:25:34 crc kubenswrapper[4926]: I0312 18:25:34.142464 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.863804424 podStartE2EDuration="7.142421601s" podCreationTimestamp="2026-03-12 18:25:27 +0000 UTC" firstStartedPulling="2026-03-12 18:25:28.167914057 +0000 UTC m=+1368.536540390" lastFinishedPulling="2026-03-12 18:25:33.446531234 +0000 UTC m=+1373.815157567" observedRunningTime="2026-03-12 18:25:34.133531852 +0000 UTC m=+1374.502158215" watchObservedRunningTime="2026-03-12 18:25:34.142421601 +0000 UTC m=+1374.511047934"
Mar 12 18:25:34 crc kubenswrapper[4926]: I0312 18:25:34.387399 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 12 18:25:34 crc kubenswrapper[4926]: I0312 18:25:34.521636 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 12 18:25:35 crc kubenswrapper[4926]: I0312 18:25:35.588118 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 12 18:25:35 crc kubenswrapper[4926]: I0312 18:25:35.588199 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 12 18:25:37 crc kubenswrapper[4926]: I0312 18:25:37.409490 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 18:25:37 crc kubenswrapper[4926]: I0312 18:25:37.409829 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 18:25:38 crc kubenswrapper[4926]: I0312 18:25:38.498476 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 18:25:38 crc kubenswrapper[4926]: I0312 18:25:38.500032 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.209:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 18:25:39 crc kubenswrapper[4926]: I0312 18:25:39.521108 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 12 18:25:39 crc kubenswrapper[4926]: I0312 18:25:39.558850 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 12 18:25:40 crc kubenswrapper[4926]: I0312 18:25:40.214807 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 12 18:25:40 crc kubenswrapper[4926]: I0312 18:25:40.588412 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 12 18:25:40 crc kubenswrapper[4926]: I0312 18:25:40.589894 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 12 18:25:41 crc kubenswrapper[4926]: I0312 18:25:41.606644 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8065d406-f127-4c57-b603-e7e6afeb3731" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 18:25:41 crc kubenswrapper[4926]: I0312 18:25:41.606675 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8065d406-f127-4c57-b603-e7e6afeb3731" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 18:25:47 crc kubenswrapper[4926]: I0312 18:25:47.415006 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 12 18:25:47 crc kubenswrapper[4926]: I0312 18:25:47.416165 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 12 18:25:47 crc kubenswrapper[4926]: I0312 18:25:47.417200 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 12 18:25:47 crc kubenswrapper[4926]: I0312 18:25:47.427797 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.282713 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.288226 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.525515 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"]
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.527069 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.537186 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"]
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603314 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603370 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swntz\" (UniqueName: \"kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603514 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603531 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603604 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.603706 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.705855 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.705909 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swntz\" (UniqueName: \"kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.705953 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.705971 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.706005 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.706042 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.706963 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.707370 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.707410 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.707529 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.707805 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.724038 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swntz\" (UniqueName: \"kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz\") pod \"dnsmasq-dns-89c5cd4d5-nfj99\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:48 crc kubenswrapper[4926]: I0312 18:25:48.858680 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:49 crc kubenswrapper[4926]: I0312 18:25:49.354553 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"]
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.299346 4926 generic.go:334] "Generic (PLEG): container finished" podID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerID="30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0" exitCode=0
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.300137 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" event={"ID":"99d97cdd-8bee-43b1-a07c-fee61fceff3a","Type":"ContainerDied","Data":"30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0"}
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.300185 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" event={"ID":"99d97cdd-8bee-43b1-a07c-fee61fceff3a","Type":"ContainerStarted","Data":"9711fea2cdf454dba0540ad56623bf62d55c2c749c2e71e1fe9a2ccbb9165424"}
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.472552 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.472819 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-central-agent" containerID="cri-o://00d08df89cba0281fd578a80d34eeb606178fe179a18e0bb4dca41d34fcaae6c" gracePeriod=30
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.472953 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-notification-agent" containerID="cri-o://22a2d1e8587226f797d924ae278b1c0e7b846e0cc05d9f335cbb31ac04af937c" gracePeriod=30
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.472954 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="sg-core" containerID="cri-o://e6bde295c7812497b3848b236f4e958ec3015f8ebeca1a4fdbd6ee1413805cb9" gracePeriod=30
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.473016 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="proxy-httpd" containerID="cri-o://2d6e21accb5b365373d4e446af9a0d2f3bb681b5f74d1a1d1d5bbeb21b98d345" gracePeriod=30
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.573727 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.210:3000/\": read tcp 10.217.0.2:42148->10.217.0.210:3000: read: connection reset by peer"
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.594180 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.596529 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 12 18:25:50 crc kubenswrapper[4926]: I0312 18:25:50.601536 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.004125 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.317278 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" event={"ID":"99d97cdd-8bee-43b1-a07c-fee61fceff3a","Type":"ContainerStarted","Data":"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.317643 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.320869 4926 generic.go:334] "Generic (PLEG): container finished" podID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerID="2d6e21accb5b365373d4e446af9a0d2f3bb681b5f74d1a1d1d5bbeb21b98d345" exitCode=0
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.320893 4926 generic.go:334] "Generic (PLEG): container finished" podID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerID="e6bde295c7812497b3848b236f4e958ec3015f8ebeca1a4fdbd6ee1413805cb9" exitCode=2
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.320901 4926 generic.go:334] "Generic (PLEG): container finished" podID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerID="22a2d1e8587226f797d924ae278b1c0e7b846e0cc05d9f335cbb31ac04af937c" exitCode=0
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.320908 4926 generic.go:334] "Generic (PLEG): container finished" podID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerID="00d08df89cba0281fd578a80d34eeb606178fe179a18e0bb4dca41d34fcaae6c" exitCode=0
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321072 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-log" containerID="cri-o://a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12" gracePeriod=30
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321282 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerDied","Data":"2d6e21accb5b365373d4e446af9a0d2f3bb681b5f74d1a1d1d5bbeb21b98d345"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321330 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerDied","Data":"e6bde295c7812497b3848b236f4e958ec3015f8ebeca1a4fdbd6ee1413805cb9"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321340 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerDied","Data":"22a2d1e8587226f797d924ae278b1c0e7b846e0cc05d9f335cbb31ac04af937c"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321349 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerDied","Data":"00d08df89cba0281fd578a80d34eeb606178fe179a18e0bb4dca41d34fcaae6c"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321357 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ccd8b5fe-8294-40c2-a6f4-3b82012686ec","Type":"ContainerDied","Data":"da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386"}
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.321366 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da79f6ead62afe8afb991cfd28d34557304f0f537134c344d4948f2cb4bfd386"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.322594 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-api" containerID="cri-o://6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c" gracePeriod=30
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.332202 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.342014 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" podStartSLOduration=3.341997774 podStartE2EDuration="3.341997774s" podCreationTimestamp="2026-03-12 18:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:51.337871535 +0000 UTC m=+1391.706497868" watchObservedRunningTime="2026-03-12 18:25:51.341997774 +0000 UTC m=+1391.710624107"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.353969 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383135 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383218 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383283 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383305 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383323 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383347 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383370 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh6lh\" (UniqueName: \"kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.383474 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data\") pod \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\" (UID: \"ccd8b5fe-8294-40c2-a6f4-3b82012686ec\") "
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.395596 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.395675 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.398428 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts" (OuterVolumeSpecName: "scripts") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.402883 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh" (OuterVolumeSpecName: "kube-api-access-hh6lh") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "kube-api-access-hh6lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.442917 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.461407 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.485455 4926 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.485686 4926 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.485775 4926 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.485863 4926 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.485944 4926 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.486024 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh6lh\" (UniqueName: \"kubernetes.io/projected/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-kube-api-access-hh6lh\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.497418 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.541195 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data" (OuterVolumeSpecName: "config-data") pod "ccd8b5fe-8294-40c2-a6f4-3b82012686ec" (UID: "ccd8b5fe-8294-40c2-a6f4-3b82012686ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.587179 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:51 crc kubenswrapper[4926]: I0312 18:25:51.587209 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccd8b5fe-8294-40c2-a6f4-3b82012686ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.329698 4926 generic.go:334] "Generic (PLEG): container finished" podID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerID="a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12" exitCode=143
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.329793 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerDied","Data":"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12"}
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.330042 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.361255 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.371243 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.392336 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:25:52 crc kubenswrapper[4926]: E0312 18:25:52.392819 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-notification-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.392843 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-notification-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: E0312 18:25:52.392863 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-central-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.392872 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-central-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: E0312 18:25:52.392896 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="proxy-httpd"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.392905 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="proxy-httpd"
Mar 12 18:25:52 crc kubenswrapper[4926]: E0312 18:25:52.392935 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="sg-core"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.392944 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="sg-core"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.393163 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-central-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.393177 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="sg-core"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.393193 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="proxy-httpd"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.393203 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" containerName="ceilometer-notification-agent"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.394997 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.397009 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.397536 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.397732 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.416839 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.500199 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd8b5fe-8294-40c2-a6f4-3b82012686ec" path="/var/lib/kubelet/pods/ccd8b5fe-8294-40c2-a6f4-3b82012686ec/volumes"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.504905 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.504969 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505029 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505169 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-scripts\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505277 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-run-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505320 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-log-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505341 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-config-data\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.505398 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fb7\" (UniqueName: \"kubernetes.io/projected/f983b88e-aba3-4d49-bbd4-4db5eef5266c-kube-api-access-h6fb7\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607099 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-scripts\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607186 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-run-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607221 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-log-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607236 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-config-data\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607267 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fb7\" (UniqueName: \"kubernetes.io/projected/f983b88e-aba3-4d49-bbd4-4db5eef5266c-kube-api-access-h6fb7\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607308 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0"
Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607358 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.607378 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.608629 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-log-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.608941 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f983b88e-aba3-4d49-bbd4-4db5eef5266c-run-httpd\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.613549 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.613658 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.614159 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-config-data\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.626024 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-scripts\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.630228 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fb7\" (UniqueName: \"kubernetes.io/projected/f983b88e-aba3-4d49-bbd4-4db5eef5266c-kube-api-access-h6fb7\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.638148 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f983b88e-aba3-4d49-bbd4-4db5eef5266c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f983b88e-aba3-4d49-bbd4-4db5eef5266c\") " pod="openstack/ceilometer-0" Mar 12 18:25:52 crc kubenswrapper[4926]: I0312 18:25:52.743827 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 18:25:53 crc kubenswrapper[4926]: I0312 18:25:53.268039 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 18:25:53 crc kubenswrapper[4926]: I0312 18:25:53.344106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f983b88e-aba3-4d49-bbd4-4db5eef5266c","Type":"ContainerStarted","Data":"efe78a1ebaf76bd5ca4c5c7d2389c61f86fefc1d3ee135c5723dc145b1fb67ee"} Mar 12 18:25:54 crc kubenswrapper[4926]: I0312 18:25:54.940740 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.084874 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs\") pod \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.084945 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle\") pod \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.084992 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrb4\" (UniqueName: \"kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4\") pod \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.085195 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data\") pod \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\" (UID: \"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4\") " Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.085642 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs" (OuterVolumeSpecName: "logs") pod "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" (UID: "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.085857 4926 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-logs\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.096418 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4" (OuterVolumeSpecName: "kube-api-access-qsrb4") pod "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" (UID: "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4"). InnerVolumeSpecName "kube-api-access-qsrb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.121627 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data" (OuterVolumeSpecName: "config-data") pod "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" (UID: "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.121735 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" (UID: "d37e2fa1-9044-4dfb-9aea-bcb441cc91a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.187904 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.187936 4926 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.187947 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrb4\" (UniqueName: \"kubernetes.io/projected/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4-kube-api-access-qsrb4\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.363016 4926 generic.go:334] "Generic (PLEG): container finished" podID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerID="6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c" exitCode=0 Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.363096 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.363106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerDied","Data":"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c"} Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.363584 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d37e2fa1-9044-4dfb-9aea-bcb441cc91a4","Type":"ContainerDied","Data":"e038299ff9160ecc8a030d680499ae9a217c62acd0df47033360941b4347d3a1"} Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.363606 4926 scope.go:117] "RemoveContainer" containerID="6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.390677 4926 scope.go:117] "RemoveContainer" containerID="a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.417905 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.418150 4926 scope.go:117] "RemoveContainer" containerID="6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c" Mar 12 18:25:55 crc kubenswrapper[4926]: E0312 18:25:55.422118 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c\": container with ID starting with 6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c not found: ID does not exist" containerID="6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c" Mar 12 18:25:55 crc kubenswrapper[4926]: 
I0312 18:25:55.422163 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c"} err="failed to get container status \"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c\": rpc error: code = NotFound desc = could not find container \"6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c\": container with ID starting with 6fa6d015869bdf6e7a8cb47337d72a2bc01b6eda4f68a9902273a561f45b686c not found: ID does not exist" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.422187 4926 scope.go:117] "RemoveContainer" containerID="a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12" Mar 12 18:25:55 crc kubenswrapper[4926]: E0312 18:25:55.422451 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12\": container with ID starting with a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12 not found: ID does not exist" containerID="a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.422494 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12"} err="failed to get container status \"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12\": rpc error: code = NotFound desc = could not find container \"a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12\": container with ID starting with a1113ff82e09dd62b2b62d25d27760593aaf0e686a84b599e8ba5e2556f57a12 not found: ID does not exist" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.431964 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.463728 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:55 crc kubenswrapper[4926]: E0312 18:25:55.464284 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-api" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.464310 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-api" Mar 12 18:25:55 crc kubenswrapper[4926]: E0312 18:25:55.464342 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-log" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.464350 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-log" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.464574 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-log" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.464613 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" containerName="nova-api-api" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.465802 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.468101 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.468477 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.472749 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.479479 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595352 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595476 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-config-data\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595502 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r22k\" (UniqueName: \"kubernetes.io/projected/65b09111-c033-45e3-97d3-cd755e1a79ab-kube-api-access-9r22k\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595615 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595714 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.595747 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b09111-c033-45e3-97d3-cd755e1a79ab-logs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698623 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b09111-c033-45e3-97d3-cd755e1a79ab-logs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698743 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698799 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-config-data\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698820 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r22k\" (UniqueName: \"kubernetes.io/projected/65b09111-c033-45e3-97d3-cd755e1a79ab-kube-api-access-9r22k\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698919 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.698972 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.699172 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b09111-c033-45e3-97d3-cd755e1a79ab-logs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.702710 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.702807 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.703083 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-config-data\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.703485 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65b09111-c033-45e3-97d3-cd755e1a79ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.718093 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r22k\" (UniqueName: \"kubernetes.io/projected/65b09111-c033-45e3-97d3-cd755e1a79ab-kube-api-access-9r22k\") pod \"nova-api-0\" (UID: \"65b09111-c033-45e3-97d3-cd755e1a79ab\") " 
pod="openstack/nova-api-0" Mar 12 18:25:55 crc kubenswrapper[4926]: I0312 18:25:55.788027 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 18:25:56 crc kubenswrapper[4926]: I0312 18:25:56.253542 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 18:25:56 crc kubenswrapper[4926]: I0312 18:25:56.373280 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65b09111-c033-45e3-97d3-cd755e1a79ab","Type":"ContainerStarted","Data":"ca502a96f6aafefcf167c3f572b8801d7bb44e17206f063d0973b12d38d5dc86"} Mar 12 18:25:56 crc kubenswrapper[4926]: I0312 18:25:56.503240 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d37e2fa1-9044-4dfb-9aea-bcb441cc91a4" path="/var/lib/kubelet/pods/d37e2fa1-9044-4dfb-9aea-bcb441cc91a4/volumes" Mar 12 18:25:57 crc kubenswrapper[4926]: I0312 18:25:57.393009 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65b09111-c033-45e3-97d3-cd755e1a79ab","Type":"ContainerStarted","Data":"3d2fbf4f1a9d4f7d8849384d3e26061ddbfabbfd65065d6a6fc041ee4c650106"} Mar 12 18:25:57 crc kubenswrapper[4926]: I0312 18:25:57.393866 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65b09111-c033-45e3-97d3-cd755e1a79ab","Type":"ContainerStarted","Data":"cbfe1612678ced632699b1895e35ffde3fc90098b673e8dcf8c6607ca19a1e86"} Mar 12 18:25:57 crc kubenswrapper[4926]: I0312 18:25:57.430686 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.4306603190000002 podStartE2EDuration="2.430660319s" podCreationTimestamp="2026-03-12 18:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:25:57.416393941 +0000 UTC m=+1397.785020304" watchObservedRunningTime="2026-03-12 18:25:57.430660319 +0000 UTC m=+1397.799286682" Mar 12 18:25:58 crc kubenswrapper[4926]: I0312 18:25:58.406077 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f983b88e-aba3-4d49-bbd4-4db5eef5266c","Type":"ContainerStarted","Data":"f7f6ae8fa45482b7b8fa1323c6a6628ac7710b7c51525caebc28a710eed57fd0"} Mar 12 18:25:58 crc kubenswrapper[4926]: I0312 18:25:58.860634 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" Mar 12 18:25:58 crc kubenswrapper[4926]: I0312 18:25:58.921600 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:25:58 crc kubenswrapper[4926]: I0312 18:25:58.921842 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerName="dnsmasq-dns" containerID="cri-o://9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3" gracePeriod=10 Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.413041 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.422061 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f983b88e-aba3-4d49-bbd4-4db5eef5266c","Type":"ContainerStarted","Data":"30fe6bd671ca7ba78619b5143656473d3732a17f263f07921efaad240adbb7e2"} Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.422150 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f983b88e-aba3-4d49-bbd4-4db5eef5266c","Type":"ContainerStarted","Data":"8833b4e39e8c00c9608f9b821b7134299d5fb3f79586aecd64a3adf71f01a68e"} Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.425191 4926 generic.go:334] "Generic (PLEG): container finished" podID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerID="9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3" exitCode=0 Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.425255 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" event={"ID":"458c7cf7-4e8a-4272-8940-1d730293a0ca","Type":"ContainerDied","Data":"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3"} Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.425310 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" event={"ID":"458c7cf7-4e8a-4272-8940-1d730293a0ca","Type":"ContainerDied","Data":"820539f8ae4b6d013f20317aef2ceed339b3e7e027d5738aa8b884143c185f6f"} Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.425333 4926 scope.go:117] "RemoveContainer" containerID="9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.425817 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jwx72" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.457857 4926 scope.go:117] "RemoveContainer" containerID="12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.480992 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.481040 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.481098 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7xnp\" (UniqueName: \"kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.481173 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.482303 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.482764 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0\") pod \"458c7cf7-4e8a-4272-8940-1d730293a0ca\" (UID: \"458c7cf7-4e8a-4272-8940-1d730293a0ca\") " Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.484887 4926 scope.go:117] "RemoveContainer" containerID="9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3" Mar 12 18:25:59 crc kubenswrapper[4926]: E0312 18:25:59.485295 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3\": container with ID starting with 9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3 not found: ID does not exist" containerID="9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.485323 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3"} err="failed to get container status \"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3\": rpc error: code = NotFound desc = could not find container \"9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3\": container with ID starting with 
9aea8ffd7746f95a03c2a7236be4f05b709c69b3e443e5e730cc0dfa7993bfc3 not found: ID does not exist" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.485344 4926 scope.go:117] "RemoveContainer" containerID="12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491" Mar 12 18:25:59 crc kubenswrapper[4926]: E0312 18:25:59.485852 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491\": container with ID starting with 12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491 not found: ID does not exist" containerID="12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.485871 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491"} err="failed to get container status \"12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491\": rpc error: code = NotFound desc = could not find container \"12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491\": container with ID starting with 12cbd32c692806fa4d603607cb6ae7ddf7f80a9712bcefaec0094fde0defb491 not found: ID does not exist" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.489919 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp" (OuterVolumeSpecName: "kube-api-access-k7xnp") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "kube-api-access-k7xnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.536228 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.543954 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.544190 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.545598 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config" (OuterVolumeSpecName: "config") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.550537 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "458c7cf7-4e8a-4272-8940-1d730293a0ca" (UID: "458c7cf7-4e8a-4272-8940-1d730293a0ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584580 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7xnp\" (UniqueName: \"kubernetes.io/projected/458c7cf7-4e8a-4272-8940-1d730293a0ca-kube-api-access-k7xnp\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584619 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584629 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584640 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584649 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.584658 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/458c7cf7-4e8a-4272-8940-1d730293a0ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.762077 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:25:59 crc kubenswrapper[4926]: I0312 18:25:59.769349 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jwx72"] Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.137708 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555666-x5zgg"] Mar 12 18:26:00 crc kubenswrapper[4926]: E0312 18:26:00.138413 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerName="init" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.138429 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerName="init" Mar 12 18:26:00 crc kubenswrapper[4926]: E0312 18:26:00.138477 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerName="dnsmasq-dns" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.138485 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" containerName="dnsmasq-dns" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.138690 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" 
containerName="dnsmasq-dns" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.139407 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.141951 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.142265 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.143038 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.147579 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555666-x5zgg"] Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.195636 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj8b2\" (UniqueName: \"kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2\") pod \"auto-csr-approver-29555666-x5zgg\" (UID: \"5ff186e2-5cbe-493a-b911-426e982888cb\") " pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.298122 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj8b2\" (UniqueName: \"kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2\") pod \"auto-csr-approver-29555666-x5zgg\" (UID: \"5ff186e2-5cbe-493a-b911-426e982888cb\") " pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.322523 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj8b2\" (UniqueName: \"kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2\") pod \"auto-csr-approver-29555666-x5zgg\" (UID: \"5ff186e2-5cbe-493a-b911-426e982888cb\") " pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.470657 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.557349 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458c7cf7-4e8a-4272-8940-1d730293a0ca" path="/var/lib/kubelet/pods/458c7cf7-4e8a-4272-8940-1d730293a0ca/volumes" Mar 12 18:26:00 crc kubenswrapper[4926]: I0312 18:26:00.994173 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555666-x5zgg"] Mar 12 18:26:01 crc kubenswrapper[4926]: W0312 18:26:01.008894 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ff186e2_5cbe_493a_b911_426e982888cb.slice/crio-c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874 WatchSource:0}: Error finding container c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874: Status 404 returned error can't find the container with id c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874 Mar 12 18:26:01 crc kubenswrapper[4926]: I0312 18:26:01.450764 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f983b88e-aba3-4d49-bbd4-4db5eef5266c","Type":"ContainerStarted","Data":"bac49af58acca4e456a30754fa1a58fd442847e42e37e4e2d91dcb1b439fd8e0"} Mar 12 18:26:01 crc kubenswrapper[4926]: I0312 18:26:01.451123 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 18:26:01 crc kubenswrapper[4926]: I0312 18:26:01.452477 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" event={"ID":"5ff186e2-5cbe-493a-b911-426e982888cb","Type":"ContainerStarted","Data":"c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874"} Mar 12 18:26:01 crc kubenswrapper[4926]: I0312 18:26:01.501247 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.589875358 podStartE2EDuration="9.501213248s" podCreationTimestamp="2026-03-12 18:25:52 +0000 UTC" firstStartedPulling="2026-03-12 18:25:53.270075114 +0000 UTC m=+1393.638701457" lastFinishedPulling="2026-03-12 18:26:01.181413014 +0000 UTC m=+1401.550039347" observedRunningTime="2026-03-12 18:26:01.485674211 +0000 UTC m=+1401.854300544" watchObservedRunningTime="2026-03-12 18:26:01.501213248 +0000 UTC m=+1401.869839601" Mar 12 18:26:03 crc kubenswrapper[4926]: I0312 18:26:03.476002 4926 generic.go:334] "Generic (PLEG): container finished" podID="5ff186e2-5cbe-493a-b911-426e982888cb" containerID="303e0f37b5ff32adc0c5ce3a0b0f3252de8de58704f569f8ef9bdf6d235b21d1" exitCode=0 Mar 12 18:26:03 crc kubenswrapper[4926]: I0312 18:26:03.476077 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" event={"ID":"5ff186e2-5cbe-493a-b911-426e982888cb","Type":"ContainerDied","Data":"303e0f37b5ff32adc0c5ce3a0b0f3252de8de58704f569f8ef9bdf6d235b21d1"} Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:04.907590 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.022618 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj8b2\" (UniqueName: \"kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2\") pod \"5ff186e2-5cbe-493a-b911-426e982888cb\" (UID: \"5ff186e2-5cbe-493a-b911-426e982888cb\") " Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.037826 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2" (OuterVolumeSpecName: "kube-api-access-fj8b2") pod "5ff186e2-5cbe-493a-b911-426e982888cb" (UID: "5ff186e2-5cbe-493a-b911-426e982888cb"). InnerVolumeSpecName "kube-api-access-fj8b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.125099 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj8b2\" (UniqueName: \"kubernetes.io/projected/5ff186e2-5cbe-493a-b911-426e982888cb-kube-api-access-fj8b2\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.502716 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" event={"ID":"5ff186e2-5cbe-493a-b911-426e982888cb","Type":"ContainerDied","Data":"c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874"} Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.503103 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f487ad4c77fc31fb454e4d9a71da5681d79d268758afebdd45515eec4de874" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.503170 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555666-x5zgg" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.789027 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.789111 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.980194 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555660-cbx5b"] Mar 12 18:26:05 crc kubenswrapper[4926]: I0312 18:26:05.992198 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555660-cbx5b"] Mar 12 18:26:06 crc kubenswrapper[4926]: I0312 18:26:06.509275 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50fb579-57d6-4029-a4f3-c8a3303bac4d" path="/var/lib/kubelet/pods/b50fb579-57d6-4029-a4f3-c8a3303bac4d/volumes" Mar 12 18:26:06 crc kubenswrapper[4926]: I0312 18:26:06.800580 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65b09111-c033-45e3-97d3-cd755e1a79ab" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 18:26:06 crc kubenswrapper[4926]: I0312 18:26:06.800650 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65b09111-c033-45e3-97d3-cd755e1a79ab" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 18:26:14 crc kubenswrapper[4926]: I0312 18:26:14.202205 4926 scope.go:117] "RemoveContainer" containerID="6f2e2b241a59eb4ec9fee306fab9cd670167a7fcd068ac1d7f97bc699a2c0ee6" Mar 12 18:26:15 crc kubenswrapper[4926]: I0312 18:26:15.797335 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 18:26:15 crc kubenswrapper[4926]: I0312 18:26:15.798023 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 18:26:15 crc kubenswrapper[4926]: I0312 18:26:15.802073 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 18:26:15 crc kubenswrapper[4926]: I0312 18:26:15.813256 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 18:26:16 crc kubenswrapper[4926]: I0312 18:26:16.614331 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 18:26:16 crc kubenswrapper[4926]: I0312 18:26:16.626928 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 18:26:22 crc kubenswrapper[4926]: I0312 18:26:22.762290 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 18:26:31 crc kubenswrapper[4926]: I0312 18:26:31.677341 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:33 crc kubenswrapper[4926]: I0312 18:26:33.509528 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:35 crc kubenswrapper[4926]: I0312 18:26:35.779090 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="rabbitmq" containerID="cri-o://ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a" gracePeriod=604796 Mar 12 18:26:37 crc kubenswrapper[4926]: I0312 18:26:37.483270 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="rabbitmq" containerID="cri-o://3e4a5dd026300b0565c4a11203b217973524caa9bcc9839e71d49583b866e88e" gracePeriod=604797 Mar 12 18:26:37 crc kubenswrapper[4926]: I0312 18:26:37.504078 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 12 18:26:37 crc kubenswrapper[4926]: I0312 18:26:37.770388 4926 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.430626 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.635915 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.635969 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636021 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7kq2\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636162 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636239 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636303 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636359 4926 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636391 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636470 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636499 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.636539 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls\") pod \"9c04aaec-485d-492f-8c24-e6860d9c78f7\" (UID: \"9c04aaec-485d-492f-8c24-e6860d9c78f7\") " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.639721 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.645497 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.646985 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info" (OuterVolumeSpecName: "pod-info") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.647019 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2" (OuterVolumeSpecName: "kube-api-access-x7kq2") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "kube-api-access-x7kq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.647759 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.652516 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.656402 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.689563 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.692392 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data" (OuterVolumeSpecName: "config-data") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.723349 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf" (OuterVolumeSpecName: "server-conf") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740195 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740225 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740237 4926 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740247 4926 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9c04aaec-485d-492f-8c24-e6860d9c78f7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740288 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740297 4926 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740308 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c04aaec-485d-492f-8c24-e6860d9c78f7-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740317 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7kq2\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-kube-api-access-x7kq2\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740326 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.740334 4926 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9c04aaec-485d-492f-8c24-e6860d9c78f7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.766080 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.794661 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9c04aaec-485d-492f-8c24-e6860d9c78f7" (UID: "9c04aaec-485d-492f-8c24-e6860d9c78f7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.840933 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9c04aaec-485d-492f-8c24-e6860d9c78f7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.840969 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.920698 4926 generic.go:334] "Generic (PLEG): container finished" podID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerID="ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a" exitCode=0 Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.920777 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.920788 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerDied","Data":"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a"} Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.920837 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9c04aaec-485d-492f-8c24-e6860d9c78f7","Type":"ContainerDied","Data":"1dfe0defccd325a5f2aba559c2e683f83b8337f73cdd57e5ea2b609807eb38bf"} Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.920855 4926 scope.go:117] "RemoveContainer" containerID="ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.944315 4926 scope.go:117] "RemoveContainer" containerID="68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.963883 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.971248 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.984100 4926 scope.go:117] "RemoveContainer" containerID="ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a" Mar 12 18:26:42 crc kubenswrapper[4926]: E0312 18:26:42.984707 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a\": container with ID starting with ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a not found: ID does not exist" containerID="ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.984737 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a"} err="failed to get container status \"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a\": rpc error: code = NotFound desc = could not find container \"ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a\": container with ID starting with ece7b3f0f166f5f8791a6a655bf54bcbe43cea3c1a722f7e373529011b357a1a not found: ID does not exist" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 
18:26:42.984762 4926 scope.go:117] "RemoveContainer" containerID="68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5" Mar 12 18:26:42 crc kubenswrapper[4926]: E0312 18:26:42.985179 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5\": container with ID starting with 68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5 not found: ID does not exist" containerID="68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.985209 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5"} err="failed to get container status \"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5\": rpc error: code = NotFound desc = could not find container \"68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5\": container with ID starting with 68d563b06d81d99aae7648289259410aed203cf265688cfbf80718ef20c0aca5 not found: ID does not exist" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.999348 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:42 crc kubenswrapper[4926]: E0312 18:26:42.999844 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff186e2-5cbe-493a-b911-426e982888cb" containerName="oc" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.999867 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff186e2-5cbe-493a-b911-426e982888cb" containerName="oc" Mar 12 18:26:42 crc kubenswrapper[4926]: E0312 18:26:42.999894 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="setup-container" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.999905 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="setup-container" Mar 12 18:26:42 crc kubenswrapper[4926]: E0312 18:26:42.999948 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="rabbitmq" Mar 12 18:26:42 crc kubenswrapper[4926]: I0312 18:26:42.999957 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="rabbitmq" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.000195 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" containerName="rabbitmq" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.000216 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff186e2-5cbe-493a-b911-426e982888cb" containerName="oc" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.001733 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005176 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005476 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005618 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k928p" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005659 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005689 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005622 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.005629 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.021418 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149017 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149106 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149150 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149245 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aba0434-585e-4355-8019-1612400b2350-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149590 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7px9q\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-kube-api-access-7px9q\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149711 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.149831 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.150124 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.150376 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aba0434-585e-4355-8019-1612400b2350-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.150462 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.150536 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252617 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252689 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252736 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aba0434-585e-4355-8019-1612400b2350-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252754 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" 
Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252772 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252793 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252809 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252834 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252853 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aba0434-585e-4355-8019-1612400b2350-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252894 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7px9q\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-kube-api-access-7px9q\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.252913 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.253891 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-config-data\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.254964 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.255857 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9aba0434-585e-4355-8019-1612400b2350-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.256499 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.256902 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.257623 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.260361 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.259651 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9aba0434-585e-4355-8019-1612400b2350-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.264283 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9aba0434-585e-4355-8019-1612400b2350-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.266250 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.281151 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7px9q\" (UniqueName: \"kubernetes.io/projected/9aba0434-585e-4355-8019-1612400b2350-kube-api-access-7px9q\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.301889 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"9aba0434-585e-4355-8019-1612400b2350\") " pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.387012 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.884502 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.932485 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aba0434-585e-4355-8019-1612400b2350","Type":"ContainerStarted","Data":"bdc479238c3510dba7a47c2f6b2e81b6d4896fc6339ded89c262931f2379255c"} Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.934512 4926 generic.go:334] "Generic (PLEG): container finished" podID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerID="3e4a5dd026300b0565c4a11203b217973524caa9bcc9839e71d49583b866e88e" exitCode=0 Mar 12 18:26:43 crc kubenswrapper[4926]: I0312 18:26:43.934602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerDied","Data":"3e4a5dd026300b0565c4a11203b217973524caa9bcc9839e71d49583b866e88e"} Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.212700 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379258 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379646 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379679 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379727 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379765 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379803 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kzhl\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379858 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379942 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.379972 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.380007 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.380057 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data\") pod \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\" (UID: \"06f09c04-6c8d-4c47-a0a5-59def6ebbf94\") " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.385079 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.385355 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.388265 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.388682 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl" (OuterVolumeSpecName: "kube-api-access-8kzhl") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "kube-api-access-8kzhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.389628 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.392573 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info" (OuterVolumeSpecName: "pod-info") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.392705 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.403465 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.439303 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data" (OuterVolumeSpecName: "config-data") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.466777 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf" (OuterVolumeSpecName: "server-conf") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: "06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481868 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481898 4926 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481908 4926 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481935 4926 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481946 4926 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481957 4926 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481966 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481977 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kzhl\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-kube-api-access-8kzhl\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481985 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.481993 4926 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.502190 4926 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.502256 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c04aaec-485d-492f-8c24-e6860d9c78f7" path="/var/lib/kubelet/pods/9c04aaec-485d-492f-8c24-e6860d9c78f7/volumes" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.546881 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "06f09c04-6c8d-4c47-a0a5-59def6ebbf94" (UID: 
"06f09c04-6c8d-4c47-a0a5-59def6ebbf94"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.584052 4926 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/06f09c04-6c8d-4c47-a0a5-59def6ebbf94-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.584086 4926 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.955661 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"06f09c04-6c8d-4c47-a0a5-59def6ebbf94","Type":"ContainerDied","Data":"851881376da52e23e91e40413019b432ba8d1a58a75cb35642884721899d9a50"} Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.955723 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:44 crc kubenswrapper[4926]: I0312 18:26:44.955735 4926 scope.go:117] "RemoveContainer" containerID="3e4a5dd026300b0565c4a11203b217973524caa9bcc9839e71d49583b866e88e" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.005647 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.026502 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.075203 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:45 crc kubenswrapper[4926]: E0312 18:26:45.075670 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="rabbitmq" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.075685 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="rabbitmq" Mar 12 18:26:45 crc kubenswrapper[4926]: E0312 18:26:45.075699 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="setup-container" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.075705 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="setup-container" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.075858 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" containerName="rabbitmq" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.087922 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.090585 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.090797 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.094029 4926 scope.go:117] "RemoveContainer" containerID="cfa97fbf75b6c2e852f5d66a227450ea6c8da1b0641deb3da6fb3c35ba9a0f12" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.097344 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.097526 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.102867 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.103237 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r4knd" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.103480 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.103667 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.197125 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b82f03-7ac1-4805-858b-708760b4e476-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.197731 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.197902 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198006 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198104 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b82f03-7ac1-4805-858b-708760b4e476-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198236 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198353 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198470 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198544 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198636 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rzmd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-kube-api-access-6rzmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.198717 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301029 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301290 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301383 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301498 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301612 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rzmd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-kube-api-access-6rzmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301744 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301860 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b82f03-7ac1-4805-858b-708760b4e476-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.301984 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.302560 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.302826 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303417 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303489 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303078 4926 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303388 4926 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303496 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b82f03-7ac1-4805-858b-708760b4e476-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.303742 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.304076 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b82f03-7ac1-4805-858b-708760b4e476-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.470231 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.471848 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b82f03-7ac1-4805-858b-708760b4e476-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.471855 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b82f03-7ac1-4805-858b-708760b4e476-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.472116 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rzmd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-kube-api-access-6rzmd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.474111 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b82f03-7ac1-4805-858b-708760b4e476-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.538807 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b9b82f03-7ac1-4805-858b-708760b4e476\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.716327 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:26:45 crc kubenswrapper[4926]: I0312 18:26:45.985106 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aba0434-585e-4355-8019-1612400b2350","Type":"ContainerStarted","Data":"f428e986bfed0b99e85dec3bb535e7ec0c4eb9551d8016425ac735d81340c54d"} Mar 12 18:26:46 crc kubenswrapper[4926]: I0312 18:26:46.094778 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 18:26:46 crc kubenswrapper[4926]: I0312 18:26:46.504221 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f09c04-6c8d-4c47-a0a5-59def6ebbf94" path="/var/lib/kubelet/pods/06f09c04-6c8d-4c47-a0a5-59def6ebbf94/volumes" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.004881 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9b82f03-7ac1-4805-858b-708760b4e476","Type":"ContainerStarted","Data":"c71d575325dd99be617816314834cd889a5cd67eb4b3f3e33266b5be5f6a25d4"} Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.793067 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.796608 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.799941 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.806159 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874696 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874761 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874801 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874868 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvxr\" (UniqueName: \"kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874946 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.874978 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.875044 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.977429 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" 
(UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.977860 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.977971 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.978064 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.978173 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvxr\" (UniqueName: \"kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.978296 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.978387 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.980033 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.980891 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.982790 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.983293 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.984149 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:47 crc kubenswrapper[4926]: I0312 18:26:47.986070 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:48 crc kubenswrapper[4926]: I0312 18:26:48.037705 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvxr\" (UniqueName: \"kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr\") pod \"dnsmasq-dns-79bd4cc8c9-4ldnw\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:48 crc kubenswrapper[4926]: I0312 18:26:48.045999 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9b82f03-7ac1-4805-858b-708760b4e476","Type":"ContainerStarted","Data":"b958f13a715a00e9a33abc7ba17caf342fd31d1ad60a10155cff9bad1045c238"} Mar 12 18:26:48 crc kubenswrapper[4926]: I0312 18:26:48.116644 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:48 crc kubenswrapper[4926]: I0312 18:26:48.618866 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:26:49 crc kubenswrapper[4926]: I0312 18:26:49.071593 4926 generic.go:334] "Generic (PLEG): container finished" podID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerID="5c57deb5d6d68579f57781cc73b1f95fd91c467933afbbc36484489589a3e08d" exitCode=0 Mar 12 18:26:49 crc kubenswrapper[4926]: I0312 18:26:49.073133 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" event={"ID":"ce48eb28-755b-4a18-8bf1-212baff1f7fc","Type":"ContainerDied","Data":"5c57deb5d6d68579f57781cc73b1f95fd91c467933afbbc36484489589a3e08d"} Mar 12 18:26:49 crc kubenswrapper[4926]: I0312 18:26:49.073212 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" event={"ID":"ce48eb28-755b-4a18-8bf1-212baff1f7fc","Type":"ContainerStarted","Data":"73aeb25dea0852f705f0adfdaeaab49793c64b81144d0bee56d28bfe72283a50"} Mar 12 18:26:50 crc kubenswrapper[4926]: I0312 18:26:50.092789 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" event={"ID":"ce48eb28-755b-4a18-8bf1-212baff1f7fc","Type":"ContainerStarted","Data":"31005cd2c4f0038c1a2e2165529b7d457634779c21c8fccaed21553ed78e6d14"} Mar 12 18:26:50 crc kubenswrapper[4926]: I0312 18:26:50.093777 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:50 crc kubenswrapper[4926]: I0312 18:26:50.124195 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" podStartSLOduration=3.124165453 podStartE2EDuration="3.124165453s" podCreationTimestamp="2026-03-12 18:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:26:50.117928068 +0000 UTC m=+1450.486554431" watchObservedRunningTime="2026-03-12 18:26:50.124165453 +0000 UTC m=+1450.492791806" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.117766 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.190160 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"] Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.190905 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="dnsmasq-dns" containerID="cri-o://c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c" gracePeriod=10 Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.426149 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fc8b56cf-6cbwx"] Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.427998 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.438653 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc8b56cf-6cbwx"] Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.496164 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.496226 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.496270 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-svc\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.496309 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.496803 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-config\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.497031 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.497103 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcwn\" (UniqueName: \"kubernetes.io/projected/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-kube-api-access-trcwn\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.602836 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603183 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603242 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-svc\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603284 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603349 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-config\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603420 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.603468 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcwn\" (UniqueName: \"kubernetes.io/projected/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-kube-api-access-trcwn\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.604470 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-nb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.605510 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-swift-storage-0\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.606142 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-openstack-edpm-ipam\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.606201 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-config\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.606796 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-dns-svc\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.607358 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-ovsdbserver-sb\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.636234 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcwn\" (UniqueName: \"kubernetes.io/projected/6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69-kube-api-access-trcwn\") pod \"dnsmasq-dns-5fc8b56cf-6cbwx\" (UID: \"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69\") " pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.693263 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704499 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704592 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704672 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704695 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704724 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.704750 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swntz\" (UniqueName: 
\"kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz\") pod \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\" (UID: \"99d97cdd-8bee-43b1-a07c-fee61fceff3a\") " Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.709278 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz" (OuterVolumeSpecName: "kube-api-access-swntz") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). InnerVolumeSpecName "kube-api-access-swntz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.758572 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.773230 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.795309 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.800136 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config" (OuterVolumeSpecName: "config") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.807054 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.807086 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.807098 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.807106 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swntz\" (UniqueName: \"kubernetes.io/projected/99d97cdd-8bee-43b1-a07c-fee61fceff3a-kube-api-access-swntz\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.818668 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.826040 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99d97cdd-8bee-43b1-a07c-fee61fceff3a" (UID: "99d97cdd-8bee-43b1-a07c-fee61fceff3a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.909562 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:58 crc kubenswrapper[4926]: I0312 18:26:58.909613 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99d97cdd-8bee-43b1-a07c-fee61fceff3a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.200230 4926 generic.go:334] "Generic (PLEG): container finished" podID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerID="c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c" exitCode=0 Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.200356 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.200390 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" event={"ID":"99d97cdd-8bee-43b1-a07c-fee61fceff3a","Type":"ContainerDied","Data":"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c"} Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.200773 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-nfj99" event={"ID":"99d97cdd-8bee-43b1-a07c-fee61fceff3a","Type":"ContainerDied","Data":"9711fea2cdf454dba0540ad56623bf62d55c2c749c2e71e1fe9a2ccbb9165424"} Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.200810 4926 scope.go:117] "RemoveContainer" containerID="c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.214004 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fc8b56cf-6cbwx"] Mar 12 18:26:59 crc kubenswrapper[4926]: W0312 18:26:59.218568 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f42a0ec_9b43_4a24_b5b2_89cd0f3abe69.slice/crio-7d27be680723fe4958ac9fa980d95153baa0ef070f295c6dfa217c24bf474a33 WatchSource:0}: Error finding container 7d27be680723fe4958ac9fa980d95153baa0ef070f295c6dfa217c24bf474a33: Status 404 returned error can't find the container with id 7d27be680723fe4958ac9fa980d95153baa0ef070f295c6dfa217c24bf474a33 Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.237174 4926 scope.go:117] "RemoveContainer" containerID="30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.258093 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"] Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.264942 4926 scope.go:117] "RemoveContainer" 
containerID="c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.265003 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-nfj99"] Mar 12 18:26:59 crc kubenswrapper[4926]: E0312 18:26:59.265392 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c\": container with ID starting with c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c not found: ID does not exist" containerID="c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.265422 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c"} err="failed to get container status \"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c\": rpc error: code = NotFound desc = could not find container \"c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c\": container with ID starting with c6990a64a13f798700cc9711674b7da102c1f60c04f410a014b3f9abaf48412c not found: ID does not exist" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.265516 4926 scope.go:117] "RemoveContainer" containerID="30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0" Mar 12 18:26:59 crc kubenswrapper[4926]: E0312 18:26:59.266016 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0\": container with ID starting with 30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0 not found: ID does not exist" containerID="30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0" Mar 12 18:26:59 crc kubenswrapper[4926]: I0312 18:26:59.266092 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0"} err="failed to get container status \"30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0\": rpc error: code = NotFound desc = could not find container \"30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0\": container with ID starting with 30acc030e2a3b017e634964990b6965f4e920c9d854391c2c49981117f81a7d0 not found: ID does not exist" Mar 12 18:27:00 crc kubenswrapper[4926]: I0312 18:27:00.211086 4926 generic.go:334] "Generic (PLEG): container finished" podID="6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69" containerID="e9a033282adcd73253cdaa2d902a610163965643025af868c4695b8f53713492" exitCode=0 Mar 12 18:27:00 crc kubenswrapper[4926]: I0312 18:27:00.211200 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" event={"ID":"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69","Type":"ContainerDied","Data":"e9a033282adcd73253cdaa2d902a610163965643025af868c4695b8f53713492"} Mar 12 18:27:00 crc kubenswrapper[4926]: I0312 18:27:00.211390 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" event={"ID":"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69","Type":"ContainerStarted","Data":"7d27be680723fe4958ac9fa980d95153baa0ef070f295c6dfa217c24bf474a33"} Mar 12 18:27:00 crc kubenswrapper[4926]: I0312 18:27:00.501447 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" path="/var/lib/kubelet/pods/99d97cdd-8bee-43b1-a07c-fee61fceff3a/volumes" Mar 12 18:27:01 crc kubenswrapper[4926]: I0312 18:27:01.225331 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" event={"ID":"6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69","Type":"ContainerStarted","Data":"f8918c17070f0ee95d203429bd314511a031eb1b8d8021bb41f8cd1c1487af96"} Mar 12 18:27:01 crc kubenswrapper[4926]: I0312 18:27:01.228464 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:27:01 crc kubenswrapper[4926]: I0312 18:27:01.259619 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" podStartSLOduration=3.25957685 podStartE2EDuration="3.25957685s" podCreationTimestamp="2026-03-12 18:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:27:01.2471572 +0000 UTC m=+1461.615783563" watchObservedRunningTime="2026-03-12 18:27:01.25957685 +0000 UTC m=+1461.628203183" Mar 12 18:27:08 crc kubenswrapper[4926]: I0312 18:27:08.761756 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fc8b56cf-6cbwx" Mar 12 18:27:08 crc kubenswrapper[4926]: I0312 18:27:08.861485 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:27:08 crc kubenswrapper[4926]: I0312 18:27:08.861822 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="dnsmasq-dns" containerID="cri-o://31005cd2c4f0038c1a2e2165529b7d457634779c21c8fccaed21553ed78e6d14" gracePeriod=10 Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.320539 4926 generic.go:334] "Generic (PLEG): container finished" podID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerID="31005cd2c4f0038c1a2e2165529b7d457634779c21c8fccaed21553ed78e6d14" exitCode=0 Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.320602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" event={"ID":"ce48eb28-755b-4a18-8bf1-212baff1f7fc","Type":"ContainerDied","Data":"31005cd2c4f0038c1a2e2165529b7d457634779c21c8fccaed21553ed78e6d14"} Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.462291 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.565976 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566091 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fvxr\" (UniqueName: \"kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566180 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566259 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566318 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566393 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.566425 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc\") pod \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\" (UID: \"ce48eb28-755b-4a18-8bf1-212baff1f7fc\") " Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.576510 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr" (OuterVolumeSpecName: "kube-api-access-5fvxr") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "kube-api-access-5fvxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.622908 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.626033 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.630206 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.631076 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.637086 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.644032 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config" (OuterVolumeSpecName: "config") pod "ce48eb28-755b-4a18-8bf1-212baff1f7fc" (UID: "ce48eb28-755b-4a18-8bf1-212baff1f7fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669826 4926 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-config\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669858 4926 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669871 4926 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669881 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669892 4926 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669902 4926 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce48eb28-755b-4a18-8bf1-212baff1f7fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:09 crc kubenswrapper[4926]: I0312 18:27:09.669912 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fvxr\" (UniqueName: \"kubernetes.io/projected/ce48eb28-755b-4a18-8bf1-212baff1f7fc-kube-api-access-5fvxr\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.337602 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" event={"ID":"ce48eb28-755b-4a18-8bf1-212baff1f7fc","Type":"ContainerDied","Data":"73aeb25dea0852f705f0adfdaeaab49793c64b81144d0bee56d28bfe72283a50"} Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.337685 4926 scope.go:117] "RemoveContainer" containerID="31005cd2c4f0038c1a2e2165529b7d457634779c21c8fccaed21553ed78e6d14" Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.337693 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-4ldnw" Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.388096 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.388728 4926 scope.go:117] "RemoveContainer" containerID="5c57deb5d6d68579f57781cc73b1f95fd91c467933afbbc36484489589a3e08d" Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.405345 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-4ldnw"] Mar 12 18:27:10 crc kubenswrapper[4926]: I0312 18:27:10.504864 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" path="/var/lib/kubelet/pods/ce48eb28-755b-4a18-8bf1-212baff1f7fc/volumes" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.580082 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:27:13 crc kubenswrapper[4926]: E0312 18:27:13.581158 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="init" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581175 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="init" Mar 12 18:27:13 crc kubenswrapper[4926]: E0312 18:27:13.581190 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581196 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: E0312 18:27:13.581204 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581211 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: E0312 18:27:13.581232 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="init" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581238 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="init" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581420 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d97cdd-8bee-43b1-a07c-fee61fceff3a" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.581453 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce48eb28-755b-4a18-8bf1-212baff1f7fc" containerName="dnsmasq-dns" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.582813 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.591345 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.651535 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnlg2\" (UniqueName: \"kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.651663 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.651730 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.755702 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnlg2\" (UniqueName: \"kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.756029 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.756158 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.756894 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.757253 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.786580 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bnlg2\" (UniqueName: \"kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2\") pod \"redhat-operators-8cm8t\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:13 crc kubenswrapper[4926]: I0312 18:27:13.965567 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.466411 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.551413 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs"] Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.553713 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.555953 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.556039 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.556424 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.557003 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.591538 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs"] Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.674005 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fvjf\" (UniqueName: \"kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.674255 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.674298 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.674466 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.776081 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.776552 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.776658 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.776712 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fvjf\" (UniqueName: \"kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.784607 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.784921 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.785764 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.797877 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fvjf\" (UniqueName: 
\"kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:14 crc kubenswrapper[4926]: I0312 18:27:14.976797 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:15 crc kubenswrapper[4926]: I0312 18:27:15.388199 4926 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerID="c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045" exitCode=0 Mar 12 18:27:15 crc kubenswrapper[4926]: I0312 18:27:15.388361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerDied","Data":"c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045"} Mar 12 18:27:15 crc kubenswrapper[4926]: I0312 18:27:15.388505 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerStarted","Data":"f146c71e8411ea1922f76fefbfa5aefdbf611c51e0b51cca9e5da48e6e433b90"} Mar 12 18:27:15 crc kubenswrapper[4926]: I0312 18:27:15.559503 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs"] Mar 12 18:27:16 crc kubenswrapper[4926]: I0312 18:27:16.414003 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" event={"ID":"58e2a40e-84ba-43fe-8087-ec2e917a2b34","Type":"ContainerStarted","Data":"79362af8345ee40914f1cad46dfe7258d5b7d2f58873fa86fb3735011e53e02e"} Mar 12 18:27:16 crc kubenswrapper[4926]: I0312 18:27:16.418176 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerStarted","Data":"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124"} Mar 12 18:27:19 crc kubenswrapper[4926]: I0312 18:27:19.452494 4926 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerID="6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124" exitCode=0 Mar 12 18:27:19 crc kubenswrapper[4926]: I0312 18:27:19.452551 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerDied","Data":"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124"} Mar 12 18:27:20 crc kubenswrapper[4926]: I0312 18:27:20.464291 4926 generic.go:334] "Generic (PLEG): container finished" podID="9aba0434-585e-4355-8019-1612400b2350" containerID="f428e986bfed0b99e85dec3bb535e7ec0c4eb9551d8016425ac735d81340c54d" exitCode=0 Mar 12 18:27:20 crc kubenswrapper[4926]: I0312 18:27:20.464387 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aba0434-585e-4355-8019-1612400b2350","Type":"ContainerDied","Data":"f428e986bfed0b99e85dec3bb535e7ec0c4eb9551d8016425ac735d81340c54d"} Mar 12 18:27:21 crc kubenswrapper[4926]: I0312 18:27:21.482508 4926 generic.go:334] "Generic (PLEG): container finished" podID="b9b82f03-7ac1-4805-858b-708760b4e476" 
containerID="b958f13a715a00e9a33abc7ba17caf342fd31d1ad60a10155cff9bad1045c238" exitCode=0 Mar 12 18:27:21 crc kubenswrapper[4926]: I0312 18:27:21.482614 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9b82f03-7ac1-4805-858b-708760b4e476","Type":"ContainerDied","Data":"b958f13a715a00e9a33abc7ba17caf342fd31d1ad60a10155cff9bad1045c238"} Mar 12 18:27:26 crc kubenswrapper[4926]: I0312 18:27:26.818371 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:27:26 crc kubenswrapper[4926]: I0312 18:27:26.821553 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.573815 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b9b82f03-7ac1-4805-858b-708760b4e476","Type":"ContainerStarted","Data":"af39b4a41c7e394aacc8c9804a619a18aba2af552e6d25e9c1ebf02ad492191e"} Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.574517 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.577500 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9aba0434-585e-4355-8019-1612400b2350","Type":"ContainerStarted","Data":"dfba317cf8f9246b78f745195185ea9a285eabf0aed35addfc16d58bb52f783c"} Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.578011 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.580545 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" event={"ID":"58e2a40e-84ba-43fe-8087-ec2e917a2b34","Type":"ContainerStarted","Data":"e358207882b6752201641451ec266917c4f67e93fa12995e0e522f44e4c78575"} Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.583071 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerStarted","Data":"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9"} Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.607133 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.607096246 podStartE2EDuration="42.607096246s" podCreationTimestamp="2026-03-12 18:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:27:27.596303938 +0000 UTC m=+1487.964930301" watchObservedRunningTime="2026-03-12 18:27:27.607096246 +0000 UTC m=+1487.975722579" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.641827 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8cm8t" podStartSLOduration=3.702316326 
podStartE2EDuration="14.641790215s" podCreationTimestamp="2026-03-12 18:27:13 +0000 UTC" firstStartedPulling="2026-03-12 18:27:15.390557515 +0000 UTC m=+1475.759183848" lastFinishedPulling="2026-03-12 18:27:26.330031404 +0000 UTC m=+1486.698657737" observedRunningTime="2026-03-12 18:27:27.62697491 +0000 UTC m=+1487.995601293" watchObservedRunningTime="2026-03-12 18:27:27.641790215 +0000 UTC m=+1488.010416548" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.665357 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" podStartSLOduration=2.885864916 podStartE2EDuration="13.665327484s" podCreationTimestamp="2026-03-12 18:27:14 +0000 UTC" firstStartedPulling="2026-03-12 18:27:15.566792175 +0000 UTC m=+1475.935418508" lastFinishedPulling="2026-03-12 18:27:26.346254743 +0000 UTC m=+1486.714881076" observedRunningTime="2026-03-12 18:27:27.652490671 +0000 UTC m=+1488.021117014" watchObservedRunningTime="2026-03-12 18:27:27.665327484 +0000 UTC m=+1488.033953837" Mar 12 18:27:27 crc kubenswrapper[4926]: I0312 18:27:27.689123 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.689088109 podStartE2EDuration="45.689088109s" podCreationTimestamp="2026-03-12 18:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:27:27.679004873 +0000 UTC m=+1488.047631206" watchObservedRunningTime="2026-03-12 18:27:27.689088109 +0000 UTC m=+1488.057714452" Mar 12 18:27:33 crc kubenswrapper[4926]: I0312 18:27:33.965626 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:33 crc kubenswrapper[4926]: I0312 18:27:33.967852 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:27:35 crc kubenswrapper[4926]: I0312 18:27:35.015608 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8cm8t" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" probeResult="failure" output=< Mar 12 18:27:35 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:27:35 crc kubenswrapper[4926]: > Mar 12 18:27:37 crc kubenswrapper[4926]: I0312 18:27:37.729238 4926 generic.go:334] "Generic (PLEG): container finished" podID="58e2a40e-84ba-43fe-8087-ec2e917a2b34" containerID="e358207882b6752201641451ec266917c4f67e93fa12995e0e522f44e4c78575" exitCode=0 Mar 12 18:27:37 crc kubenswrapper[4926]: I0312 18:27:37.729322 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" event={"ID":"58e2a40e-84ba-43fe-8087-ec2e917a2b34","Type":"ContainerDied","Data":"e358207882b6752201641451ec266917c4f67e93fa12995e0e522f44e4c78575"} Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.230854 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.322391 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory\") pod \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.322809 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fvjf\" (UniqueName: \"kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf\") pod \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.322938 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam\") pod \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.323223 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle\") pod \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\" (UID: \"58e2a40e-84ba-43fe-8087-ec2e917a2b34\") " Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.329223 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf" (OuterVolumeSpecName: "kube-api-access-4fvjf") pod "58e2a40e-84ba-43fe-8087-ec2e917a2b34" (UID: "58e2a40e-84ba-43fe-8087-ec2e917a2b34"). InnerVolumeSpecName "kube-api-access-4fvjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.335707 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "58e2a40e-84ba-43fe-8087-ec2e917a2b34" (UID: "58e2a40e-84ba-43fe-8087-ec2e917a2b34"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.359310 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58e2a40e-84ba-43fe-8087-ec2e917a2b34" (UID: "58e2a40e-84ba-43fe-8087-ec2e917a2b34"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.365382 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory" (OuterVolumeSpecName: "inventory") pod "58e2a40e-84ba-43fe-8087-ec2e917a2b34" (UID: "58e2a40e-84ba-43fe-8087-ec2e917a2b34"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.425587 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.425626 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fvjf\" (UniqueName: \"kubernetes.io/projected/58e2a40e-84ba-43fe-8087-ec2e917a2b34-kube-api-access-4fvjf\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.425639 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.425648 4926 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e2a40e-84ba-43fe-8087-ec2e917a2b34-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.753318 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" event={"ID":"58e2a40e-84ba-43fe-8087-ec2e917a2b34","Type":"ContainerDied","Data":"79362af8345ee40914f1cad46dfe7258d5b7d2f58873fa86fb3735011e53e02e"} Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.753662 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79362af8345ee40914f1cad46dfe7258d5b7d2f58873fa86fb3735011e53e02e" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.753369 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.850885 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z"] Mar 12 18:27:39 crc kubenswrapper[4926]: E0312 18:27:39.851294 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e2a40e-84ba-43fe-8087-ec2e917a2b34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.851329 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e2a40e-84ba-43fe-8087-ec2e917a2b34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.851592 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e2a40e-84ba-43fe-8087-ec2e917a2b34" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.852210 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.855966 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.856121 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.856379 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.856991 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.866110 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z"] Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.935588 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.935778 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.936184 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpczd\" (UniqueName: \"kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:39 crc kubenswrapper[4926]: I0312 18:27:39.936341 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.038539 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpczd\" (UniqueName: \"kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.038632 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.038719 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.038796 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.043620 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.044053 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.051218 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.066024 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpczd\" (UniqueName: \"kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.175732 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.763117 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z"] Mar 12 18:27:40 crc kubenswrapper[4926]: I0312 18:27:40.774913 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:27:41 crc kubenswrapper[4926]: I0312 18:27:41.246198 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:27:41 crc kubenswrapper[4926]: I0312 18:27:41.789404 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" event={"ID":"75a3208b-42f5-412e-a503-ac328f7d9967","Type":"ContainerStarted","Data":"6e0e0f76508b934742dbabb4b94eeab9a4fa625bd527c90d90fa0cda46ddea4a"} Mar 12 18:27:41 crc kubenswrapper[4926]: I0312 18:27:41.789818 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" event={"ID":"75a3208b-42f5-412e-a503-ac328f7d9967","Type":"ContainerStarted","Data":"beacf7f2c4ecf74d1f45669096c28cb46b36b877d88778fb003f1ce1efb01ae4"} Mar 12 18:27:41 crc kubenswrapper[4926]: I0312 18:27:41.817019 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" podStartSLOduration=2.348196392 podStartE2EDuration="2.81699493s" podCreationTimestamp="2026-03-12 18:27:39 +0000 UTC" firstStartedPulling="2026-03-12 18:27:40.774729893 +0000 UTC m=+1501.143356226" lastFinishedPulling="2026-03-12 18:27:41.243528421 +0000 UTC m=+1501.612154764" observedRunningTime="2026-03-12 18:27:41.807856718 +0000 UTC m=+1502.176483141" watchObservedRunningTime="2026-03-12 18:27:41.81699493 +0000 UTC m=+1502.185621263" Mar 12 18:27:43 crc kubenswrapper[4926]: I0312 18:27:43.390689 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 18:27:45 crc kubenswrapper[4926]: I0312 18:27:45.007666 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8cm8t" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" probeResult="failure" output=< Mar 12 18:27:45 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:27:45 crc kubenswrapper[4926]: > Mar 12 18:27:45 crc kubenswrapper[4926]: I0312 18:27:45.720641 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 18:27:55 crc kubenswrapper[4926]: I0312 18:27:55.010773 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8cm8t" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" probeResult="failure" output=< Mar 12 18:27:55 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:27:55 crc kubenswrapper[4926]: > Mar 12 18:27:56 crc kubenswrapper[4926]: I0312 18:27:56.817472 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:27:56 crc kubenswrapper[4926]: I0312 18:27:56.817779 4926 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.145093 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555668-xnfj4"] Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.146925 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.150816 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.151019 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.152593 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.170880 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555668-xnfj4"] Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.308178 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjngx\" (UniqueName: \"kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx\") pod \"auto-csr-approver-29555668-xnfj4\" (UID: \"ddbd98cc-c135-485c-9b18-c31f19321a38\") " pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.410133 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjngx\" (UniqueName: \"kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx\") pod \"auto-csr-approver-29555668-xnfj4\" (UID: \"ddbd98cc-c135-485c-9b18-c31f19321a38\") " pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.454053 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjngx\" (UniqueName: \"kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx\") pod \"auto-csr-approver-29555668-xnfj4\" (UID: \"ddbd98cc-c135-485c-9b18-c31f19321a38\") " pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.470073 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:00 crc kubenswrapper[4926]: I0312 18:28:00.966286 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555668-xnfj4"] Mar 12 18:28:00 crc kubenswrapper[4926]: W0312 18:28:00.982334 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddbd98cc_c135_485c_9b18_c31f19321a38.slice/crio-0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160 WatchSource:0}: Error finding container 0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160: Status 404 returned error can't find the container with id 0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160 Mar 12 18:28:01 crc kubenswrapper[4926]: I0312 18:28:01.015555 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" event={"ID":"ddbd98cc-c135-485c-9b18-c31f19321a38","Type":"ContainerStarted","Data":"0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160"} Mar 12 18:28:03 crc kubenswrapper[4926]: I0312 18:28:03.037201 4926 generic.go:334] "Generic (PLEG): container finished" podID="ddbd98cc-c135-485c-9b18-c31f19321a38" containerID="74c22411866d45d292c4082e0d4be48c52ad4683049047a45e2de32d9da3478f" exitCode=0 Mar 12 18:28:03 crc kubenswrapper[4926]: I0312 18:28:03.037294 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" event={"ID":"ddbd98cc-c135-485c-9b18-c31f19321a38","Type":"ContainerDied","Data":"74c22411866d45d292c4082e0d4be48c52ad4683049047a45e2de32d9da3478f"} Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.037381 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.092027 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.280547 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.377063 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.506028 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjngx\" (UniqueName: \"kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx\") pod \"ddbd98cc-c135-485c-9b18-c31f19321a38\" (UID: \"ddbd98cc-c135-485c-9b18-c31f19321a38\") " Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.522313 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx" (OuterVolumeSpecName: "kube-api-access-xjngx") pod "ddbd98cc-c135-485c-9b18-c31f19321a38" (UID: "ddbd98cc-c135-485c-9b18-c31f19321a38"). InnerVolumeSpecName "kube-api-access-xjngx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:28:04 crc kubenswrapper[4926]: I0312 18:28:04.608632 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjngx\" (UniqueName: \"kubernetes.io/projected/ddbd98cc-c135-485c-9b18-c31f19321a38-kube-api-access-xjngx\") on node \"crc\" DevicePath \"\"" Mar 12 18:28:05 crc kubenswrapper[4926]: I0312 18:28:05.066499 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" event={"ID":"ddbd98cc-c135-485c-9b18-c31f19321a38","Type":"ContainerDied","Data":"0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160"} Mar 12 18:28:05 crc kubenswrapper[4926]: I0312 18:28:05.066533 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555668-xnfj4" Mar 12 18:28:05 crc kubenswrapper[4926]: I0312 18:28:05.066572 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc8726e4b9c908068de0db72c97f99f237cbd253cb409610e6f712f3dbcd160" Mar 12 18:28:05 crc kubenswrapper[4926]: E0312 18:28:05.262740 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddbd98cc_c135_485c_9b18_c31f19321a38.slice\": RecentStats: unable to find data in memory cache]" Mar 12 18:28:05 crc kubenswrapper[4926]: I0312 18:28:05.446420 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555662-wmwgq"] Mar 12 18:28:05 crc kubenswrapper[4926]: I0312 18:28:05.453522 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555662-wmwgq"] Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.076673 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8cm8t" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" containerID="cri-o://0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9" gracePeriod=2 Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.502527 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3def1951-b6f9-4621-8428-b3e169e34279" path="/var/lib/kubelet/pods/3def1951-b6f9-4621-8428-b3e169e34279/volumes" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.633788 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.749685 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content\") pod \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.749799 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities\") pod \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.749967 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnlg2\" (UniqueName: \"kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2\") pod \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\" (UID: \"f2f7ab38-c8b3-43a3-91f8-32168e7216ab\") " Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.750514 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities" (OuterVolumeSpecName: "utilities") pod "f2f7ab38-c8b3-43a3-91f8-32168e7216ab" (UID: "f2f7ab38-c8b3-43a3-91f8-32168e7216ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.759281 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2" (OuterVolumeSpecName: "kube-api-access-bnlg2") pod "f2f7ab38-c8b3-43a3-91f8-32168e7216ab" (UID: "f2f7ab38-c8b3-43a3-91f8-32168e7216ab"). InnerVolumeSpecName "kube-api-access-bnlg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.852386 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.852425 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnlg2\" (UniqueName: \"kubernetes.io/projected/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-kube-api-access-bnlg2\") on node \"crc\" DevicePath \"\"" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.878877 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2f7ab38-c8b3-43a3-91f8-32168e7216ab" (UID: "f2f7ab38-c8b3-43a3-91f8-32168e7216ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:28:06 crc kubenswrapper[4926]: I0312 18:28:06.954535 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2f7ab38-c8b3-43a3-91f8-32168e7216ab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.092283 4926 generic.go:334] "Generic (PLEG): container finished" podID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerID="0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9" exitCode=0 Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.092357 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerDied","Data":"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9"} Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.092409 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8cm8t" event={"ID":"f2f7ab38-c8b3-43a3-91f8-32168e7216ab","Type":"ContainerDied","Data":"f146c71e8411ea1922f76fefbfa5aefdbf611c51e0b51cca9e5da48e6e433b90"} Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.092480 4926 scope.go:117] "RemoveContainer" containerID="0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.093683 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8cm8t" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.118662 4926 scope.go:117] "RemoveContainer" containerID="6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.157983 4926 scope.go:117] "RemoveContainer" containerID="c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.162582 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.179233 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8cm8t"] Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.192705 4926 scope.go:117] "RemoveContainer" containerID="0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9" Mar 12 18:28:07 crc kubenswrapper[4926]: E0312 18:28:07.193206 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9\": container with ID starting with 0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9 not found: ID does not exist" containerID="0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.193249 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9"} err="failed to get container status \"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9\": rpc error: code = NotFound desc = could not find container \"0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9\": container with ID starting with 0a9c576f66c558f54f81f42fdf69438a16dadc2a363b3f13d9021793b3966fb9 not found: ID does not exist" Mar 12 18:28:07 crc 
kubenswrapper[4926]: I0312 18:28:07.193278 4926 scope.go:117] "RemoveContainer" containerID="6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124" Mar 12 18:28:07 crc kubenswrapper[4926]: E0312 18:28:07.193582 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124\": container with ID starting with 6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124 not found: ID does not exist" containerID="6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.193608 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124"} err="failed to get container status \"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124\": rpc error: code = NotFound desc = could not find container \"6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124\": container with ID starting with 6c8be519f6eb6a16345af9a869dccf59f8f37215a19f757ecf6f68becb437124 not found: ID does not exist" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.193622 4926 scope.go:117] "RemoveContainer" containerID="c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045" Mar 12 18:28:07 crc kubenswrapper[4926]: E0312 18:28:07.193908 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045\": container with ID starting with c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045 not found: ID does not exist" containerID="c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045" Mar 12 18:28:07 crc kubenswrapper[4926]: I0312 18:28:07.193932 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045"} err="failed to get container status \"c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045\": rpc error: code = NotFound desc = could not find container \"c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045\": container with ID starting with c7612d6f6ac3e40212f1f47065b076937ee6ab26f5cab2dec5bc30b2d4230045 not found: ID does not exist" Mar 12 18:28:08 crc kubenswrapper[4926]: I0312 18:28:08.517754 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" path="/var/lib/kubelet/pods/f2f7ab38-c8b3-43a3-91f8-32168e7216ab/volumes" Mar 12 18:28:14 crc kubenswrapper[4926]: I0312 18:28:14.643078 4926 scope.go:117] "RemoveContainer" containerID="6e11e8ecbd719cce51e25090959e2c12962023683d089714c24c4c379fbeaa75" Mar 12 18:28:14 crc kubenswrapper[4926]: I0312 18:28:14.695475 4926 scope.go:117] "RemoveContainer" containerID="bf4088b7010b072d07c97127da27603d49f76d54fb965fcf30cdce478bed5715" Mar 12 18:28:26 crc kubenswrapper[4926]: I0312 18:28:26.817222 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:28:26 crc kubenswrapper[4926]: I0312 18:28:26.818682 4926 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:28:26 crc kubenswrapper[4926]: I0312 18:28:26.818756 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:28:26 crc kubenswrapper[4926]: I0312 18:28:26.819506 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:28:26 crc kubenswrapper[4926]: I0312 18:28:26.819567 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde" gracePeriod=600 Mar 12 18:28:27 crc kubenswrapper[4926]: I0312 18:28:27.312114 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde" exitCode=0 Mar 12 18:28:27 crc kubenswrapper[4926]: I0312 18:28:27.312189 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde"} Mar 12 18:28:27 crc kubenswrapper[4926]: I0312 18:28:27.312505 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e"} Mar 12 18:28:27 crc kubenswrapper[4926]: I0312 18:28:27.312529 4926 scope.go:117] "RemoveContainer" containerID="9728bd6132bdd9ab31a71d0a44779a02f515c39e712bb7cc4f8a85610efe739f" Mar 12 18:29:14 crc kubenswrapper[4926]: I0312 18:29:14.828804 4926 scope.go:117] "RemoveContainer" containerID="334e13044642c4b0a613f2bfccdc8e4e0067f4f734093e62a567052168539e3c" Mar 12 18:29:14 crc kubenswrapper[4926]: I0312 18:29:14.880593 4926 scope.go:117] "RemoveContainer" containerID="f80e900658b9e00858dd34680664f8b822bb4e0eba4897392ac1a1da7732a56c" Mar 12 18:29:14 crc kubenswrapper[4926]: I0312 18:29:14.931968 4926 scope.go:117] "RemoveContainer" containerID="f2d7197b9b3bf88ce380eff3700bdf47487a3209f60cede9380f4a5b74fd2a07" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.629416 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:34 crc kubenswrapper[4926]: E0312 18:29:34.630330 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630342 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" Mar 12 18:29:34 crc 
kubenswrapper[4926]: E0312 18:29:34.630356 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="extract-utilities" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630362 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="extract-utilities" Mar 12 18:29:34 crc kubenswrapper[4926]: E0312 18:29:34.630370 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="extract-content" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630376 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="extract-content" Mar 12 18:29:34 crc kubenswrapper[4926]: E0312 18:29:34.630412 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbd98cc-c135-485c-9b18-c31f19321a38" containerName="oc" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630418 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbd98cc-c135-485c-9b18-c31f19321a38" containerName="oc" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630606 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f7ab38-c8b3-43a3-91f8-32168e7216ab" containerName="registry-server" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.630624 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbd98cc-c135-485c-9b18-c31f19321a38" containerName="oc" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.632670 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.654838 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.728638 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.728991 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcggp\" (UniqueName: \"kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.729071 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.830560 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 
18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.830675 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcggp\" (UniqueName: \"kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.830709 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.831333 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.831348 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.852460 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcggp\" (UniqueName: \"kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp\") pod \"certified-operators-97nz8\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:34 crc kubenswrapper[4926]: I0312 18:29:34.961046 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:35 crc kubenswrapper[4926]: I0312 18:29:35.500802 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:35 crc kubenswrapper[4926]: W0312 18:29:35.505899 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa4d5480_0ddc_4ef0_8c78_2dc18fedd103.slice/crio-37034daa38cf52c12ba3c600dedf2eb059e0d574ad95211676b768bc7e090443 WatchSource:0}: Error finding container 37034daa38cf52c12ba3c600dedf2eb059e0d574ad95211676b768bc7e090443: Status 404 returned error can't find the container with id 37034daa38cf52c12ba3c600dedf2eb059e0d574ad95211676b768bc7e090443 Mar 12 18:29:36 crc kubenswrapper[4926]: I0312 18:29:36.068375 4926 generic.go:334] "Generic (PLEG): container finished" podID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerID="7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1" exitCode=0 Mar 12 18:29:36 crc kubenswrapper[4926]: I0312 18:29:36.068471 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerDied","Data":"7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1"} Mar 12 18:29:36 crc kubenswrapper[4926]: I0312 18:29:36.068741 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerStarted","Data":"37034daa38cf52c12ba3c600dedf2eb059e0d574ad95211676b768bc7e090443"} Mar 12 18:29:37 crc kubenswrapper[4926]: I0312 18:29:37.079598 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerStarted","Data":"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a"} Mar 12 18:29:38 crc kubenswrapper[4926]: I0312 18:29:38.092805 4926 generic.go:334] "Generic (PLEG): container finished" podID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerID="484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a" exitCode=0 Mar 12 18:29:38 crc kubenswrapper[4926]: I0312 18:29:38.092855 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerDied","Data":"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a"} Mar 12 18:29:39 crc kubenswrapper[4926]: I0312 18:29:39.107234 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerStarted","Data":"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7"} Mar 12 18:29:39 crc kubenswrapper[4926]: I0312 18:29:39.128508 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-97nz8" podStartSLOduration=2.418181901 podStartE2EDuration="5.12848204s" podCreationTimestamp="2026-03-12 18:29:34 +0000 UTC" firstStartedPulling="2026-03-12 18:29:36.070171425 +0000 UTC m=+1616.438797758" lastFinishedPulling="2026-03-12 18:29:38.780471564 +0000 UTC m=+1619.149097897" observedRunningTime="2026-03-12 18:29:39.128363927 +0000 UTC m=+1619.496990310" watchObservedRunningTime="2026-03-12 18:29:39.12848204 +0000 UTC m=+1619.497108373" Mar 
12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.014370 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5k9gj"] Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.016578 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.029719 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5k9gj"] Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.141157 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-catalog-content\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.141203 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjqw\" (UniqueName: \"kubernetes.io/projected/0817ed2e-acdd-41b8-b210-84281525839f-kube-api-access-lbjqw\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.141331 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-utilities\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.243314 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-utilities\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.243405 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-catalog-content\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.243429 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjqw\" (UniqueName: \"kubernetes.io/projected/0817ed2e-acdd-41b8-b210-84281525839f-kube-api-access-lbjqw\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.243800 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-utilities\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.244169 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0817ed2e-acdd-41b8-b210-84281525839f-catalog-content\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.265736 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjqw\" (UniqueName: \"kubernetes.io/projected/0817ed2e-acdd-41b8-b210-84281525839f-kube-api-access-lbjqw\") pod \"community-operators-5k9gj\" (UID: \"0817ed2e-acdd-41b8-b210-84281525839f\") " pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.335259 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:40 crc kubenswrapper[4926]: I0312 18:29:40.690379 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5k9gj"] Mar 12 18:29:41 crc kubenswrapper[4926]: I0312 18:29:41.132056 4926 generic.go:334] "Generic (PLEG): container finished" podID="0817ed2e-acdd-41b8-b210-84281525839f" containerID="8e93fac4e2765b9277b5b265a7c08055ab7f2b0fb83f9631b99c08b9f04874cf" exitCode=0 Mar 12 18:29:41 crc kubenswrapper[4926]: I0312 18:29:41.132114 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5k9gj" event={"ID":"0817ed2e-acdd-41b8-b210-84281525839f","Type":"ContainerDied","Data":"8e93fac4e2765b9277b5b265a7c08055ab7f2b0fb83f9631b99c08b9f04874cf"} Mar 12 18:29:41 crc kubenswrapper[4926]: I0312 18:29:41.132174 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5k9gj" event={"ID":"0817ed2e-acdd-41b8-b210-84281525839f","Type":"ContainerStarted","Data":"838aa39e9c3fa588ebc1ec1db1d9fae71627a58effba2a45c640d72c9ec4e6aa"} Mar 12 18:29:44 crc kubenswrapper[4926]: I0312 18:29:44.962153 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:44 crc kubenswrapper[4926]: I0312 18:29:44.962659 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:45 crc kubenswrapper[4926]: I0312 18:29:45.006721 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:45 crc kubenswrapper[4926]: I0312 18:29:45.231977 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:45 crc kubenswrapper[4926]: I0312 18:29:45.291616 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:46 crc kubenswrapper[4926]: I0312 18:29:46.177866 4926 generic.go:334] "Generic (PLEG): container finished" podID="0817ed2e-acdd-41b8-b210-84281525839f" containerID="9f135147b5771f7de5d2ccfbfc2d02a3d2a2771ee65bbf1e8e1f0a86854a1e59" exitCode=0 Mar 12 18:29:46 crc kubenswrapper[4926]: I0312 18:29:46.177937 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5k9gj" event={"ID":"0817ed2e-acdd-41b8-b210-84281525839f","Type":"ContainerDied","Data":"9f135147b5771f7de5d2ccfbfc2d02a3d2a2771ee65bbf1e8e1f0a86854a1e59"} Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.191630 4926 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-97nz8" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="registry-server" containerID="cri-o://af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7" gracePeriod=2 Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.192163 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5k9gj" event={"ID":"0817ed2e-acdd-41b8-b210-84281525839f","Type":"ContainerStarted","Data":"7abe901bddf6d43393bae272cbfbd5c3f58b73de8b29e2c5d9741cbc83e0fbc0"} Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.240582 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5k9gj" podStartSLOduration=2.802052173 podStartE2EDuration="8.240560076s" podCreationTimestamp="2026-03-12 18:29:39 +0000 UTC" firstStartedPulling="2026-03-12 18:29:41.13476528 +0000 UTC m=+1621.503391613" lastFinishedPulling="2026-03-12 18:29:46.573273143 +0000 UTC m=+1626.941899516" observedRunningTime="2026-03-12 18:29:47.211507974 +0000 UTC m=+1627.580134337" watchObservedRunningTime="2026-03-12 18:29:47.240560076 +0000 UTC m=+1627.609186409" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.665673 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.816688 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content\") pod \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.817074 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities\") pod \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.817317 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcggp\" (UniqueName: \"kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp\") pod \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\" (UID: \"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103\") " Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.817916 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities" (OuterVolumeSpecName: "utilities") pod "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" (UID: "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.823124 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp" (OuterVolumeSpecName: "kube-api-access-gcggp") pod "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" (UID: "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103"). InnerVolumeSpecName "kube-api-access-gcggp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.920660 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.920705 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcggp\" (UniqueName: \"kubernetes.io/projected/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-kube-api-access-gcggp\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:47 crc kubenswrapper[4926]: I0312 18:29:47.992399 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" (UID: "aa4d5480-0ddc-4ef0-8c78-2dc18fedd103"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.023215 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.210174 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97nz8" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.210196 4926 generic.go:334] "Generic (PLEG): container finished" podID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerID="af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7" exitCode=0 Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.210247 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerDied","Data":"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7"} Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.211518 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97nz8" event={"ID":"aa4d5480-0ddc-4ef0-8c78-2dc18fedd103","Type":"ContainerDied","Data":"37034daa38cf52c12ba3c600dedf2eb059e0d574ad95211676b768bc7e090443"} Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.211543 4926 scope.go:117] "RemoveContainer" containerID="af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.242591 4926 scope.go:117] "RemoveContainer" containerID="484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.258368 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.269877 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-97nz8"] Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.284778 4926 scope.go:117] "RemoveContainer" containerID="7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.330617 4926 scope.go:117] "RemoveContainer" containerID="af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7" Mar 12 18:29:48 crc kubenswrapper[4926]: E0312 18:29:48.333245 4926 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7\": container with ID starting with af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7 not found: ID does not exist" containerID="af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.333291 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7"} err="failed to get container status \"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7\": rpc error: code = NotFound desc = could not find container \"af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7\": container with ID starting with af770418be95ad042053034dde6f8e0b355bb6b982997db4c758ea7364966da7 not found: ID does not exist" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.333319 4926 scope.go:117] "RemoveContainer" containerID="484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a" Mar 12 18:29:48 crc kubenswrapper[4926]: E0312 18:29:48.333655 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a\": container with ID starting with 484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a not found: ID does not exist" containerID="484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.334563 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a"} err="failed to get container status \"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a\": rpc error: code = NotFound desc = could not find container \"484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a\": container with ID starting with 484f44600934344af2c346f90fc870cde957b1b9ae36fc19a8cb8fd79089750a not found: ID does not exist" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.334598 4926 scope.go:117] "RemoveContainer" containerID="7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1" Mar 12 18:29:48 crc kubenswrapper[4926]: E0312 18:29:48.334894 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1\": container with ID starting with 7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1 not found: ID does not exist" containerID="7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.334923 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1"} err="failed to get container status \"7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1\": rpc error: code = NotFound desc = could not find container \"7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1\": container with ID starting with 7d5874fb1e97421d1554dc44919f59f132287bb2c204fb0390ffc107768b89d1 not found: ID does not exist" Mar 12 18:29:48 crc kubenswrapper[4926]: I0312 18:29:48.503985 4926 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" path="/var/lib/kubelet/pods/aa4d5480-0ddc-4ef0-8c78-2dc18fedd103/volumes" Mar 12 18:29:50 crc kubenswrapper[4926]: I0312 18:29:50.335525 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:50 crc kubenswrapper[4926]: I0312 18:29:50.335966 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:50 crc kubenswrapper[4926]: I0312 18:29:50.382573 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:51 crc kubenswrapper[4926]: I0312 18:29:51.310673 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5k9gj" Mar 12 18:29:51 crc kubenswrapper[4926]: I0312 18:29:51.493937 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5k9gj"] Mar 12 18:29:51 crc kubenswrapper[4926]: I0312 18:29:51.653123 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:29:51 crc kubenswrapper[4926]: I0312 18:29:51.653422 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89rth" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="registry-server" containerID="cri-o://b6bb1201cf8704c1c5218ae5ad6eb53e93f35257e13f18d1b638bfe3b2da6322" gracePeriod=2 Mar 12 18:29:52 crc kubenswrapper[4926]: I0312 18:29:52.266177 4926 generic.go:334] "Generic (PLEG): container finished" podID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerID="b6bb1201cf8704c1c5218ae5ad6eb53e93f35257e13f18d1b638bfe3b2da6322" exitCode=0 Mar 12 18:29:52 crc kubenswrapper[4926]: I0312 18:29:52.266469 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerDied","Data":"b6bb1201cf8704c1c5218ae5ad6eb53e93f35257e13f18d1b638bfe3b2da6322"} Mar 12 18:29:52 crc kubenswrapper[4926]: I0312 18:29:52.907171 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.020561 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content\") pod \"8ed429f6-2923-42c2-a3b5-402c3dff4858\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.020733 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities\") pod \"8ed429f6-2923-42c2-a3b5-402c3dff4858\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.020844 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5zfs\" (UniqueName: \"kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs\") pod \"8ed429f6-2923-42c2-a3b5-402c3dff4858\" (UID: \"8ed429f6-2923-42c2-a3b5-402c3dff4858\") " Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.022222 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities" (OuterVolumeSpecName: "utilities") pod "8ed429f6-2923-42c2-a3b5-402c3dff4858" (UID: "8ed429f6-2923-42c2-a3b5-402c3dff4858"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.029564 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs" (OuterVolumeSpecName: "kube-api-access-c5zfs") pod "8ed429f6-2923-42c2-a3b5-402c3dff4858" (UID: "8ed429f6-2923-42c2-a3b5-402c3dff4858"). InnerVolumeSpecName "kube-api-access-c5zfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.090265 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ed429f6-2923-42c2-a3b5-402c3dff4858" (UID: "8ed429f6-2923-42c2-a3b5-402c3dff4858"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.123265 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.123296 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ed429f6-2923-42c2-a3b5-402c3dff4858-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.123307 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5zfs\" (UniqueName: \"kubernetes.io/projected/8ed429f6-2923-42c2-a3b5-402c3dff4858-kube-api-access-c5zfs\") on node \"crc\" DevicePath \"\"" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.278547 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89rth" event={"ID":"8ed429f6-2923-42c2-a3b5-402c3dff4858","Type":"ContainerDied","Data":"75b5fa3c244e1a8120e460f27209d8d4bc13d27e6ea5e2efce3f0acb11f95356"} Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.278587 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89rth" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.278616 4926 scope.go:117] "RemoveContainer" containerID="b6bb1201cf8704c1c5218ae5ad6eb53e93f35257e13f18d1b638bfe3b2da6322" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.302605 4926 scope.go:117] "RemoveContainer" containerID="2efb7a52879f66525ffb82d5cb820bfe29543aedd6332a889b48b179b150edf8" Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.315692 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.328595 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89rth"] Mar 12 18:29:53 crc kubenswrapper[4926]: I0312 18:29:53.333787 4926 scope.go:117] "RemoveContainer" containerID="b4a2b0d2b334eaa88bb6c32b357510979e7382c6582dd687f2155b5f20b07067" Mar 12 18:29:54 crc kubenswrapper[4926]: I0312 18:29:54.507499 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" path="/var/lib/kubelet/pods/8ed429f6-2923-42c2-a3b5-402c3dff4858/volumes" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.062251 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.063981 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064004 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.064021 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="extract-content" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064027 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="extract-content" Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.064075 4926 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="extract-content" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064082 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="extract-content" Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.064106 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="extract-utilities" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064116 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="extract-utilities" Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.064131 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064139 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: E0312 18:29:59.064154 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="extract-utilities" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064162 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="extract-utilities" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064425 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4d5480-0ddc-4ef0-8c78-2dc18fedd103" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.064479 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed429f6-2923-42c2-a3b5-402c3dff4858" containerName="registry-server" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.066317 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.090575 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.265781 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g5dw\" (UniqueName: \"kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.265981 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.266076 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.368112 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.368218 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.368271 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g5dw\" (UniqueName: \"kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.368744 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.368782 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.391348 4926 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5g5dw\" (UniqueName: \"kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw\") pod \"redhat-marketplace-29mgs\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.392164 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:29:59 crc kubenswrapper[4926]: I0312 18:29:59.920652 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.171429 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555670-v8lqv"] Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.173847 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.185169 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx"] Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.186518 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.186660 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.186796 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.186796 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.187922 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk\") pod \"auto-csr-approver-29555670-v8lqv\" (UID: \"daa768ee-1644-40fa-8f52-790b28d9bb74\") " pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.191868 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.192456 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.195348 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555670-v8lqv"] Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.206288 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx"] Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.289491 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk\") pod \"auto-csr-approver-29555670-v8lqv\" (UID: \"daa768ee-1644-40fa-8f52-790b28d9bb74\") " 
pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.289590 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.289717 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.289792 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwtbg\" (UniqueName: \"kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.312510 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk\") pod \"auto-csr-approver-29555670-v8lqv\" (UID: \"daa768ee-1644-40fa-8f52-790b28d9bb74\") " pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.361233 4926 generic.go:334] "Generic (PLEG): container finished" podID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerID="a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11" exitCode=0 Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.361313 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerDied","Data":"a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11"} Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.361361 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerStarted","Data":"7e4eb6cfcd4ff4c95aa64507913fece03b2186570bccad32ac581e160aa57575"} Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.394300 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.395365 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwtbg\" (UniqueName: \"kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 
18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.395547 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.396937 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.400107 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.416178 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwtbg\" (UniqueName: \"kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg\") pod \"collect-profiles-29555670-fm7kx\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.503786 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.517597 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:00 crc kubenswrapper[4926]: I0312 18:30:00.994686 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555670-v8lqv"] Mar 12 18:30:01 crc kubenswrapper[4926]: W0312 18:30:01.001057 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa768ee_1644_40fa_8f52_790b28d9bb74.slice/crio-1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba WatchSource:0}: Error finding container 1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba: Status 404 returned error can't find the container with id 1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba Mar 12 18:30:01 crc kubenswrapper[4926]: W0312 18:30:01.089983 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a960be2_da37_4f11_a8ba_d2c056550139.slice/crio-a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51 WatchSource:0}: Error finding container a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51: Status 404 returned error can't find the container with id a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51 Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.092307 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx"] Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.375601 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" event={"ID":"6a960be2-da37-4f11-a8ba-d2c056550139","Type":"ContainerStarted","Data":"fc548ee0da579e49aeea484208cb8bc3f74184e26c19a3b70e464f8de8ed58e9"} Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.375931 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" event={"ID":"6a960be2-da37-4f11-a8ba-d2c056550139","Type":"ContainerStarted","Data":"a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51"} Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.377698 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" event={"ID":"daa768ee-1644-40fa-8f52-790b28d9bb74","Type":"ContainerStarted","Data":"1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba"} Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.383387 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerStarted","Data":"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345"} Mar 12 18:30:01 crc kubenswrapper[4926]: I0312 18:30:01.396280 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" podStartSLOduration=1.396258495 podStartE2EDuration="1.396258495s" podCreationTimestamp="2026-03-12 18:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 18:30:01.391609251 +0000 UTC m=+1641.760235584" watchObservedRunningTime="2026-03-12 18:30:01.396258495 +0000 UTC m=+1641.764884828" Mar 12 18:30:02 crc kubenswrapper[4926]: I0312 18:30:02.400782 4926 
generic.go:334] "Generic (PLEG): container finished" podID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerID="cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345" exitCode=0 Mar 12 18:30:02 crc kubenswrapper[4926]: I0312 18:30:02.400902 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerDied","Data":"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345"} Mar 12 18:30:02 crc kubenswrapper[4926]: I0312 18:30:02.409159 4926 generic.go:334] "Generic (PLEG): container finished" podID="6a960be2-da37-4f11-a8ba-d2c056550139" containerID="fc548ee0da579e49aeea484208cb8bc3f74184e26c19a3b70e464f8de8ed58e9" exitCode=0 Mar 12 18:30:02 crc kubenswrapper[4926]: I0312 18:30:02.409233 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" event={"ID":"6a960be2-da37-4f11-a8ba-d2c056550139","Type":"ContainerDied","Data":"fc548ee0da579e49aeea484208cb8bc3f74184e26c19a3b70e464f8de8ed58e9"} Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.434522 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerStarted","Data":"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33"} Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.465459 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-29mgs" podStartSLOduration=1.946480881 podStartE2EDuration="4.465426322s" podCreationTimestamp="2026-03-12 18:29:59 +0000 UTC" firstStartedPulling="2026-03-12 18:30:00.364167744 +0000 UTC m=+1640.732794107" lastFinishedPulling="2026-03-12 18:30:02.883113215 +0000 UTC m=+1643.251739548" observedRunningTime="2026-03-12 18:30:03.464494414 +0000 UTC m=+1643.833120797" watchObservedRunningTime="2026-03-12 18:30:03.465426322 +0000 UTC m=+1643.834052645" Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.830786 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.977863 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume\") pod \"6a960be2-da37-4f11-a8ba-d2c056550139\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.977932 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume\") pod \"6a960be2-da37-4f11-a8ba-d2c056550139\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.977963 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwtbg\" (UniqueName: \"kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg\") pod \"6a960be2-da37-4f11-a8ba-d2c056550139\" (UID: \"6a960be2-da37-4f11-a8ba-d2c056550139\") " Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.980594 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume" (OuterVolumeSpecName: "config-volume") pod "6a960be2-da37-4f11-a8ba-d2c056550139" (UID: "6a960be2-da37-4f11-a8ba-d2c056550139"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 18:30:03 crc kubenswrapper[4926]: I0312 18:30:03.989665 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6a960be2-da37-4f11-a8ba-d2c056550139" (UID: "6a960be2-da37-4f11-a8ba-d2c056550139"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.022715 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg" (OuterVolumeSpecName: "kube-api-access-jwtbg") pod "6a960be2-da37-4f11-a8ba-d2c056550139" (UID: "6a960be2-da37-4f11-a8ba-d2c056550139"). InnerVolumeSpecName "kube-api-access-jwtbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.080628 4926 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6a960be2-da37-4f11-a8ba-d2c056550139-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.080688 4926 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a960be2-da37-4f11-a8ba-d2c056550139-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.080701 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwtbg\" (UniqueName: \"kubernetes.io/projected/6a960be2-da37-4f11-a8ba-d2c056550139-kube-api-access-jwtbg\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.443145 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" event={"ID":"6a960be2-da37-4f11-a8ba-d2c056550139","Type":"ContainerDied","Data":"a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51"} Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.443557 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6a819de11d45bbd132f324f357d9013738950f71532f8e7b0e6fea145687b51" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.443205 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555670-fm7kx" Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.445729 4926 generic.go:334] "Generic (PLEG): container finished" podID="daa768ee-1644-40fa-8f52-790b28d9bb74" containerID="799ca781836933eed5f381ea7d6b69d2905b6867e2a978eb9f38bd2b549ed6cb" exitCode=0 Mar 12 18:30:04 crc kubenswrapper[4926]: I0312 18:30:04.446603 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" event={"ID":"daa768ee-1644-40fa-8f52-790b28d9bb74","Type":"ContainerDied","Data":"799ca781836933eed5f381ea7d6b69d2905b6867e2a978eb9f38bd2b549ed6cb"} Mar 12 18:30:05 crc kubenswrapper[4926]: I0312 18:30:05.833139 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:05 crc kubenswrapper[4926]: I0312 18:30:05.918385 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk\") pod \"daa768ee-1644-40fa-8f52-790b28d9bb74\" (UID: \"daa768ee-1644-40fa-8f52-790b28d9bb74\") " Mar 12 18:30:05 crc kubenswrapper[4926]: I0312 18:30:05.926270 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk" (OuterVolumeSpecName: "kube-api-access-gxtgk") pod "daa768ee-1644-40fa-8f52-790b28d9bb74" (UID: "daa768ee-1644-40fa-8f52-790b28d9bb74"). InnerVolumeSpecName "kube-api-access-gxtgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.021140 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtgk\" (UniqueName: \"kubernetes.io/projected/daa768ee-1644-40fa-8f52-790b28d9bb74-kube-api-access-gxtgk\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.466628 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" event={"ID":"daa768ee-1644-40fa-8f52-790b28d9bb74","Type":"ContainerDied","Data":"1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba"} Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.466941 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1249afae3d68fc8aea5725888627c7574591fd0ccbc7538d07300e420c6955ba" Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.466770 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555670-v8lqv" Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.913213 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555664-xqlj7"] Mar 12 18:30:06 crc kubenswrapper[4926]: I0312 18:30:06.923820 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555664-xqlj7"] Mar 12 18:30:08 crc kubenswrapper[4926]: I0312 18:30:08.504217 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a93b50-2038-4bf8-8c5f-bc77148d55f8" path="/var/lib/kubelet/pods/d0a93b50-2038-4bf8-8c5f-bc77148d55f8/volumes" Mar 12 18:30:09 crc kubenswrapper[4926]: I0312 18:30:09.392944 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:09 crc kubenswrapper[4926]: I0312 18:30:09.393114 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:09 crc kubenswrapper[4926]: I0312 18:30:09.454302 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:09 crc kubenswrapper[4926]: I0312 18:30:09.567211 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:09 crc kubenswrapper[4926]: I0312 18:30:09.688830 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:30:11 crc kubenswrapper[4926]: I0312 18:30:11.514068 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-29mgs" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="registry-server" containerID="cri-o://8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33" gracePeriod=2 Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.527143 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.528187 4926 generic.go:334] "Generic (PLEG): container finished" podID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerID="8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33" exitCode=0 Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.528231 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerDied","Data":"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33"} Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.528258 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mgs" event={"ID":"86dc3218-ffd0-4d12-af08-2c1323f6b670","Type":"ContainerDied","Data":"7e4eb6cfcd4ff4c95aa64507913fece03b2186570bccad32ac581e160aa57575"} Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.528275 4926 scope.go:117] "RemoveContainer" containerID="8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.548746 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content\") pod \"86dc3218-ffd0-4d12-af08-2c1323f6b670\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.548824 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities\") pod \"86dc3218-ffd0-4d12-af08-2c1323f6b670\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.548983 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g5dw\" (UniqueName: \"kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw\") pod \"86dc3218-ffd0-4d12-af08-2c1323f6b670\" (UID: \"86dc3218-ffd0-4d12-af08-2c1323f6b670\") " Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.551623 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities" (OuterVolumeSpecName: "utilities") pod "86dc3218-ffd0-4d12-af08-2c1323f6b670" (UID: "86dc3218-ffd0-4d12-af08-2c1323f6b670"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.557799 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw" (OuterVolumeSpecName: "kube-api-access-5g5dw") pod "86dc3218-ffd0-4d12-af08-2c1323f6b670" (UID: "86dc3218-ffd0-4d12-af08-2c1323f6b670"). InnerVolumeSpecName "kube-api-access-5g5dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.564272 4926 scope.go:117] "RemoveContainer" containerID="cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.576852 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86dc3218-ffd0-4d12-af08-2c1323f6b670" (UID: "86dc3218-ffd0-4d12-af08-2c1323f6b670"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.637398 4926 scope.go:117] "RemoveContainer" containerID="a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.654259 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g5dw\" (UniqueName: \"kubernetes.io/projected/86dc3218-ffd0-4d12-af08-2c1323f6b670-kube-api-access-5g5dw\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.654303 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.654318 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86dc3218-ffd0-4d12-af08-2c1323f6b670-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.677397 4926 scope.go:117] "RemoveContainer" containerID="8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33" Mar 12 18:30:12 crc kubenswrapper[4926]: E0312 18:30:12.678803 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33\": container with ID starting with 8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33 not found: ID does not exist" containerID="8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.678836 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33"} err="failed to get container status \"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33\": rpc error: code = NotFound desc = could not find container \"8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33\": container with ID starting with 8838f3b187da1dc89cc4a1ec3bc43e78dc94dbbd93ce731af57b9f4fa5832c33 not found: ID does not exist" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.678855 4926 scope.go:117] "RemoveContainer" containerID="cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345" Mar 12 18:30:12 crc kubenswrapper[4926]: E0312 18:30:12.679092 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345\": container with ID starting with cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345 not found: ID does not exist" containerID="cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345" Mar 
12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.679119 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345"} err="failed to get container status \"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345\": rpc error: code = NotFound desc = could not find container \"cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345\": container with ID starting with cb5751d1b0b2b4518890d4be951952deff5a2671bf075c4b9c9dd65a74f33345 not found: ID does not exist" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.679137 4926 scope.go:117] "RemoveContainer" containerID="a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11" Mar 12 18:30:12 crc kubenswrapper[4926]: E0312 18:30:12.679394 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11\": container with ID starting with a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11 not found: ID does not exist" containerID="a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11" Mar 12 18:30:12 crc kubenswrapper[4926]: I0312 18:30:12.679428 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11"} err="failed to get container status \"a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11\": rpc error: code = NotFound desc = could not find container \"a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11\": container with ID starting with a0856534fb16cefe399c293a5eafbda265f3f89c4fdfbdb5141264cad4c82e11 not found: ID does not exist" Mar 12 18:30:13 crc kubenswrapper[4926]: I0312 18:30:13.540534 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mgs" Mar 12 18:30:13 crc kubenswrapper[4926]: I0312 18:30:13.586367 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:30:13 crc kubenswrapper[4926]: I0312 18:30:13.598278 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mgs"] Mar 12 18:30:14 crc kubenswrapper[4926]: I0312 18:30:14.505028 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" path="/var/lib/kubelet/pods/86dc3218-ffd0-4d12-af08-2c1323f6b670/volumes" Mar 12 18:30:15 crc kubenswrapper[4926]: I0312 18:30:15.058361 4926 scope.go:117] "RemoveContainer" containerID="e0b6e5ac15bb107790c6fb8dd27667c6693088633c203e8fcc884a1e78a79525" Mar 12 18:30:15 crc kubenswrapper[4926]: I0312 18:30:15.121899 4926 scope.go:117] "RemoveContainer" containerID="3e365f4cb50835b1180ff7c19d7fc397df5fff675446efdb00f5237496ddec2b" Mar 12 18:30:15 crc kubenswrapper[4926]: I0312 18:30:15.150874 4926 scope.go:117] "RemoveContainer" containerID="48ea775ffdf8bca7487b51cdb7c8d987f732c9702e1c95ad138a8eebbbab7c90" Mar 12 18:30:15 crc kubenswrapper[4926]: I0312 18:30:15.179388 4926 scope.go:117] "RemoveContainer" containerID="293a2572fcb8b2d40b187537e1c594705c232add90d5f330faf45d6e18e49dd0" Mar 12 18:30:15 crc kubenswrapper[4926]: I0312 18:30:15.218563 4926 scope.go:117] "RemoveContainer" containerID="a12ed3e178db267afb04649b5718c1d2bcc9a78faa88e5e9519c5b32b47a1362" Mar 12 18:30:30 crc kubenswrapper[4926]: I0312 18:30:30.723506 4926 generic.go:334] "Generic (PLEG): container finished" podID="75a3208b-42f5-412e-a503-ac328f7d9967" containerID="6e0e0f76508b934742dbabb4b94eeab9a4fa625bd527c90d90fa0cda46ddea4a" exitCode=0 Mar 12 18:30:30 crc kubenswrapper[4926]: I0312 18:30:30.723588 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" event={"ID":"75a3208b-42f5-412e-a503-ac328f7d9967","Type":"ContainerDied","Data":"6e0e0f76508b934742dbabb4b94eeab9a4fa625bd527c90d90fa0cda46ddea4a"} Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.172774 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.347728 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpczd\" (UniqueName: \"kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd\") pod \"75a3208b-42f5-412e-a503-ac328f7d9967\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.347823 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam\") pod \"75a3208b-42f5-412e-a503-ac328f7d9967\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.347897 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory\") pod \"75a3208b-42f5-412e-a503-ac328f7d9967\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.347956 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle\") pod \"75a3208b-42f5-412e-a503-ac328f7d9967\" (UID: \"75a3208b-42f5-412e-a503-ac328f7d9967\") " Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.354030 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "75a3208b-42f5-412e-a503-ac328f7d9967" (UID: "75a3208b-42f5-412e-a503-ac328f7d9967"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.354825 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd" (OuterVolumeSpecName: "kube-api-access-mpczd") pod "75a3208b-42f5-412e-a503-ac328f7d9967" (UID: "75a3208b-42f5-412e-a503-ac328f7d9967"). InnerVolumeSpecName "kube-api-access-mpczd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.381405 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory" (OuterVolumeSpecName: "inventory") pod "75a3208b-42f5-412e-a503-ac328f7d9967" (UID: "75a3208b-42f5-412e-a503-ac328f7d9967"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.400896 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "75a3208b-42f5-412e-a503-ac328f7d9967" (UID: "75a3208b-42f5-412e-a503-ac328f7d9967"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.449774 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.449808 4926 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.449818 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpczd\" (UniqueName: \"kubernetes.io/projected/75a3208b-42f5-412e-a503-ac328f7d9967-kube-api-access-mpczd\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.449827 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/75a3208b-42f5-412e-a503-ac328f7d9967-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.748211 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" event={"ID":"75a3208b-42f5-412e-a503-ac328f7d9967","Type":"ContainerDied","Data":"beacf7f2c4ecf74d1f45669096c28cb46b36b877d88778fb003f1ce1efb01ae4"} Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.748258 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beacf7f2c4ecf74d1f45669096c28cb46b36b877d88778fb003f1ce1efb01ae4" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.748407 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.838930 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq"] Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839511 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="extract-utilities" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839538 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="extract-utilities" Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839557 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="registry-server" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839567 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="registry-server" Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839608 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="extract-content" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839618 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="extract-content" Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839637 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a3208b-42f5-412e-a503-ac328f7d9967" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839646 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a3208b-42f5-412e-a503-ac328f7d9967" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839666 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa768ee-1644-40fa-8f52-790b28d9bb74" containerName="oc" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839675 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa768ee-1644-40fa-8f52-790b28d9bb74" containerName="oc" Mar 12 18:30:32 crc kubenswrapper[4926]: E0312 18:30:32.839693 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a960be2-da37-4f11-a8ba-d2c056550139" containerName="collect-profiles" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839701 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a960be2-da37-4f11-a8ba-d2c056550139" containerName="collect-profiles" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839930 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a960be2-da37-4f11-a8ba-d2c056550139" containerName="collect-profiles" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839956 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a3208b-42f5-412e-a503-ac328f7d9967" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839977 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa768ee-1644-40fa-8f52-790b28d9bb74" containerName="oc" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.839998 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dc3218-ffd0-4d12-af08-2c1323f6b670" containerName="registry-server" Mar 12 18:30:32 
crc kubenswrapper[4926]: I0312 18:30:32.840857 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.843577 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.843719 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.843738 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.846902 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq"] Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.849237 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.963373 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5q9m\" (UniqueName: \"kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.963636 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:32 crc kubenswrapper[4926]: I0312 18:30:32.963840 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.065314 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5q9m\" (UniqueName: \"kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.065540 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.065638 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.071288 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.078914 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.095252 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5q9m\" (UniqueName: \"kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:33 crc kubenswrapper[4926]: I0312 18:30:33.166095 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:30:34 crc kubenswrapper[4926]: I0312 18:30:34.321102 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq"] Mar 12 18:30:34 crc kubenswrapper[4926]: I0312 18:30:34.769220 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" event={"ID":"eab3d0c2-5edc-4657-928f-52a87de2293a","Type":"ContainerStarted","Data":"b5e1946b94fd7ed87879292f5fef7a875c3edb29871fbcc32b4cd0930bf71f7f"} Mar 12 18:30:35 crc kubenswrapper[4926]: I0312 18:30:35.782931 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" event={"ID":"eab3d0c2-5edc-4657-928f-52a87de2293a","Type":"ContainerStarted","Data":"d773dc87aff248613f6b9d5825635f53e85aba8c66114657cd4d6a99eafa2f07"} Mar 12 18:30:35 crc kubenswrapper[4926]: I0312 18:30:35.813951 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" podStartSLOduration=3.149726207 podStartE2EDuration="3.813926245s" podCreationTimestamp="2026-03-12 18:30:32 +0000 UTC" firstStartedPulling="2026-03-12 18:30:34.323212945 +0000 UTC m=+1674.691839278" lastFinishedPulling="2026-03-12 18:30:34.987412973 +0000 UTC m=+1675.356039316" observedRunningTime="2026-03-12 18:30:35.8053592 +0000 UTC m=+1676.173985543" watchObservedRunningTime="2026-03-12 18:30:35.813926245 +0000 UTC m=+1676.182552588" Mar 12 18:30:56 crc kubenswrapper[4926]: I0312 18:30:56.818260 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:30:56 crc kubenswrapper[4926]: I0312 18:30:56.819087 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.428734 4926 scope.go:117] "RemoveContainer" containerID="7655ace271d6f3feff7f2f4eaa16c5ea625e5fe6beab0c4038907b7f5e715987" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.455164 4926 scope.go:117] "RemoveContainer" containerID="435ef2132df5554295889ec9ba808434afc58d13077a9a39e7222dd605149d55" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.479871 4926 scope.go:117] "RemoveContainer" containerID="89db61a8e375fc043c0de0eeea971f98c1fb8c205fa68c44c6f73887667808c6" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.501599 4926 scope.go:117] "RemoveContainer" containerID="d312e71f700c8194066a4bfb0efbe92c4e2b9fbced6805317853fff15c35d5d6" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.524294 4926 scope.go:117] "RemoveContainer" containerID="0b78f1753e8c4a22ed2f4147f06789b56f6db668b7a92626590701ea109bcd09" Mar 12 18:31:15 crc kubenswrapper[4926]: I0312 18:31:15.542320 4926 scope.go:117] "RemoveContainer" containerID="2acf5477ea08c1fb97c59765c73d7abfdadc6189fbd1cf4549d66c1007c1d817" Mar 12 18:31:26 crc kubenswrapper[4926]: I0312 18:31:26.817654 4926 patch_prober.go:28] 
interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:31:26 crc kubenswrapper[4926]: I0312 18:31:26.818350 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:31:42 crc kubenswrapper[4926]: I0312 18:31:42.545860 4926 generic.go:334] "Generic (PLEG): container finished" podID="eab3d0c2-5edc-4657-928f-52a87de2293a" containerID="d773dc87aff248613f6b9d5825635f53e85aba8c66114657cd4d6a99eafa2f07" exitCode=0 Mar 12 18:31:42 crc kubenswrapper[4926]: I0312 18:31:42.545979 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" event={"ID":"eab3d0c2-5edc-4657-928f-52a87de2293a","Type":"ContainerDied","Data":"d773dc87aff248613f6b9d5825635f53e85aba8c66114657cd4d6a99eafa2f07"} Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.060671 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.214557 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5q9m\" (UniqueName: \"kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m\") pod \"eab3d0c2-5edc-4657-928f-52a87de2293a\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.214617 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam\") pod \"eab3d0c2-5edc-4657-928f-52a87de2293a\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.214798 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory\") pod \"eab3d0c2-5edc-4657-928f-52a87de2293a\" (UID: \"eab3d0c2-5edc-4657-928f-52a87de2293a\") " Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.222152 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m" (OuterVolumeSpecName: "kube-api-access-k5q9m") pod "eab3d0c2-5edc-4657-928f-52a87de2293a" (UID: "eab3d0c2-5edc-4657-928f-52a87de2293a"). InnerVolumeSpecName "kube-api-access-k5q9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.243854 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory" (OuterVolumeSpecName: "inventory") pod "eab3d0c2-5edc-4657-928f-52a87de2293a" (UID: "eab3d0c2-5edc-4657-928f-52a87de2293a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.270304 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eab3d0c2-5edc-4657-928f-52a87de2293a" (UID: "eab3d0c2-5edc-4657-928f-52a87de2293a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.317423 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5q9m\" (UniqueName: \"kubernetes.io/projected/eab3d0c2-5edc-4657-928f-52a87de2293a-kube-api-access-k5q9m\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.317471 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.317486 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eab3d0c2-5edc-4657-928f-52a87de2293a-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.573965 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" event={"ID":"eab3d0c2-5edc-4657-928f-52a87de2293a","Type":"ContainerDied","Data":"b5e1946b94fd7ed87879292f5fef7a875c3edb29871fbcc32b4cd0930bf71f7f"} Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.574471 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e1946b94fd7ed87879292f5fef7a875c3edb29871fbcc32b4cd0930bf71f7f" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.574076 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.733082 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj"] Mar 12 18:31:44 crc kubenswrapper[4926]: E0312 18:31:44.733526 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab3d0c2-5edc-4657-928f-52a87de2293a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.733545 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab3d0c2-5edc-4657-928f-52a87de2293a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.733745 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab3d0c2-5edc-4657-928f-52a87de2293a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.734293 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.736324 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.736353 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.736756 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.741959 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.751797 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj"] Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.827250 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd4mv\" (UniqueName: \"kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.827295 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.827325 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.929294 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd4mv\" (UniqueName: \"kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.929373 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.929497 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.936569 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.937639 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:44 crc kubenswrapper[4926]: I0312 18:31:44.951142 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd4mv\" (UniqueName: \"kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-55zdj\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:45 crc kubenswrapper[4926]: I0312 18:31:45.054640 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:45 crc kubenswrapper[4926]: I0312 18:31:45.651016 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj"] Mar 12 18:31:46 crc kubenswrapper[4926]: I0312 18:31:46.601375 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" event={"ID":"96a114de-73e3-4088-902e-c3d1fcaaa3ad","Type":"ContainerStarted","Data":"710f1a12525865b4371846d75a84287b63e3823a526dd6bb16dfbf5e78d6ddda"} Mar 12 18:31:46 crc kubenswrapper[4926]: I0312 18:31:46.601853 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" event={"ID":"96a114de-73e3-4088-902e-c3d1fcaaa3ad","Type":"ContainerStarted","Data":"e75f8ec3b61a16b23223b1074cf55568f6ef14af0ee49e12c4225a40410f4cd0"} Mar 12 18:31:46 crc kubenswrapper[4926]: I0312 18:31:46.634359 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" podStartSLOduration=2.134787726 podStartE2EDuration="2.634340266s" podCreationTimestamp="2026-03-12 18:31:44 +0000 UTC" firstStartedPulling="2026-03-12 18:31:45.666642832 +0000 UTC m=+1746.035269175" lastFinishedPulling="2026-03-12 18:31:46.166195372 +0000 UTC m=+1746.534821715" observedRunningTime="2026-03-12 18:31:46.621848648 +0000 UTC m=+1746.990475001" watchObservedRunningTime="2026-03-12 18:31:46.634340266 +0000 UTC m=+1747.002966619" Mar 12 18:31:51 crc kubenswrapper[4926]: I0312 18:31:51.661208 4926 generic.go:334] "Generic (PLEG): container finished" podID="96a114de-73e3-4088-902e-c3d1fcaaa3ad" 
containerID="710f1a12525865b4371846d75a84287b63e3823a526dd6bb16dfbf5e78d6ddda" exitCode=0 Mar 12 18:31:51 crc kubenswrapper[4926]: I0312 18:31:51.661318 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" event={"ID":"96a114de-73e3-4088-902e-c3d1fcaaa3ad","Type":"ContainerDied","Data":"710f1a12525865b4371846d75a84287b63e3823a526dd6bb16dfbf5e78d6ddda"} Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.115222 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.205311 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory\") pod \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.205508 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam\") pod \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.205650 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd4mv\" (UniqueName: \"kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv\") pod \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\" (UID: \"96a114de-73e3-4088-902e-c3d1fcaaa3ad\") " Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.212571 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv" (OuterVolumeSpecName: "kube-api-access-wd4mv") pod "96a114de-73e3-4088-902e-c3d1fcaaa3ad" (UID: "96a114de-73e3-4088-902e-c3d1fcaaa3ad"). InnerVolumeSpecName "kube-api-access-wd4mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.235771 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "96a114de-73e3-4088-902e-c3d1fcaaa3ad" (UID: "96a114de-73e3-4088-902e-c3d1fcaaa3ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.244280 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory" (OuterVolumeSpecName: "inventory") pod "96a114de-73e3-4088-902e-c3d1fcaaa3ad" (UID: "96a114de-73e3-4088-902e-c3d1fcaaa3ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.308837 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.308881 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/96a114de-73e3-4088-902e-c3d1fcaaa3ad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.308894 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd4mv\" (UniqueName: \"kubernetes.io/projected/96a114de-73e3-4088-902e-c3d1fcaaa3ad-kube-api-access-wd4mv\") on node \"crc\" DevicePath \"\"" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.686057 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" event={"ID":"96a114de-73e3-4088-902e-c3d1fcaaa3ad","Type":"ContainerDied","Data":"e75f8ec3b61a16b23223b1074cf55568f6ef14af0ee49e12c4225a40410f4cd0"} Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.686356 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75f8ec3b61a16b23223b1074cf55568f6ef14af0ee49e12c4225a40410f4cd0" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.686114 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-55zdj" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.766182 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr"] Mar 12 18:31:53 crc kubenswrapper[4926]: E0312 18:31:53.766843 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a114de-73e3-4088-902e-c3d1fcaaa3ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.766863 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a114de-73e3-4088-902e-c3d1fcaaa3ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.767047 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a114de-73e3-4088-902e-c3d1fcaaa3ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.767666 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.770914 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.771073 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.771344 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.773195 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.808247 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr"] Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.920578 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.920652 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:53 crc kubenswrapper[4926]: I0312 18:31:53.921253 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9cp\" (UniqueName: \"kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.023145 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9cp\" (UniqueName: \"kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.023350 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.023391 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.028279 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.031279 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.046628 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9cp\" (UniqueName: \"kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-njwrr\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.122658 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:31:54 crc kubenswrapper[4926]: I0312 18:31:54.759256 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr"] Mar 12 18:31:55 crc kubenswrapper[4926]: I0312 18:31:55.715354 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" event={"ID":"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46","Type":"ContainerStarted","Data":"573a9ceb3cb256f42bb14e281728fe4976f6e1621ed33c7cafc9ec1b14a4b12b"} Mar 12 18:31:55 crc kubenswrapper[4926]: I0312 18:31:55.715749 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" event={"ID":"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46","Type":"ContainerStarted","Data":"2a98a7c2f2a3e2f97eddca16c9e2490f09751655edf87b4db97219d94185a80f"} Mar 12 18:31:55 crc kubenswrapper[4926]: I0312 18:31:55.738902 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" podStartSLOduration=2.339037015 podStartE2EDuration="2.73887545s" podCreationTimestamp="2026-03-12 18:31:53 +0000 UTC" firstStartedPulling="2026-03-12 18:31:54.747128702 +0000 UTC m=+1755.115755035" lastFinishedPulling="2026-03-12 18:31:55.146967127 +0000 UTC m=+1755.515593470" observedRunningTime="2026-03-12 18:31:55.733538565 +0000 UTC m=+1756.102164898" watchObservedRunningTime="2026-03-12 18:31:55.73887545 +0000 UTC m=+1756.107501813" Mar 12 18:31:56 crc kubenswrapper[4926]: I0312 18:31:56.817998 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 12 18:31:56 crc kubenswrapper[4926]: I0312 18:31:56.818374 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:31:56 crc kubenswrapper[4926]: I0312 18:31:56.818461 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:31:56 crc kubenswrapper[4926]: I0312 18:31:56.819576 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:31:56 crc kubenswrapper[4926]: I0312 18:31:56.819686 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" gracePeriod=600 Mar 12 18:31:56 crc kubenswrapper[4926]: E0312 18:31:56.950871 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:31:57 crc kubenswrapper[4926]: I0312 18:31:57.750119 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" exitCode=0 Mar 12 18:31:57 crc kubenswrapper[4926]: I0312 18:31:57.750189 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e"} Mar 12 18:31:57 crc kubenswrapper[4926]: I0312 18:31:57.750244 4926 scope.go:117] "RemoveContainer" containerID="759fd18072cdf8fcc7bc2d92cc950b5720a437d7e4487f5098fffd2244e21cde" Mar 12 18:31:57 crc kubenswrapper[4926]: I0312 18:31:57.751147 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:31:57 crc kubenswrapper[4926]: E0312 18:31:57.751760 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.131914 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555672-dq68h"] Mar 12 
18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.133671 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.136046 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.136204 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.136352 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.148692 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555672-dq68h"] Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.245812 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hth86\" (UniqueName: \"kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86\") pod \"auto-csr-approver-29555672-dq68h\" (UID: \"609b8dbc-517d-4483-b02a-d7445cd2aa2f\") " pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.347788 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hth86\" (UniqueName: \"kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86\") pod \"auto-csr-approver-29555672-dq68h\" (UID: \"609b8dbc-517d-4483-b02a-d7445cd2aa2f\") " pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.372831 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hth86\" (UniqueName: \"kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86\") pod \"auto-csr-approver-29555672-dq68h\" (UID: \"609b8dbc-517d-4483-b02a-d7445cd2aa2f\") " pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.451058 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:00 crc kubenswrapper[4926]: I0312 18:32:00.948253 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555672-dq68h"] Mar 12 18:32:01 crc kubenswrapper[4926]: I0312 18:32:01.794677 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555672-dq68h" event={"ID":"609b8dbc-517d-4483-b02a-d7445cd2aa2f","Type":"ContainerStarted","Data":"7182f54670fbf2e0ae70b5529a32090424d21ecca05ea5da6c657dbbb51c12ca"} Mar 12 18:32:02 crc kubenswrapper[4926]: I0312 18:32:02.807520 4926 generic.go:334] "Generic (PLEG): container finished" podID="609b8dbc-517d-4483-b02a-d7445cd2aa2f" containerID="95bf3d49a01a19202e8eb63fbef45841e7cc791e3f3ccb3272c728981fc71d7d" exitCode=0 Mar 12 18:32:02 crc kubenswrapper[4926]: I0312 18:32:02.807628 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555672-dq68h" event={"ID":"609b8dbc-517d-4483-b02a-d7445cd2aa2f","Type":"ContainerDied","Data":"95bf3d49a01a19202e8eb63fbef45841e7cc791e3f3ccb3272c728981fc71d7d"} Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.180788 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.225676 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hth86\" (UniqueName: \"kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86\") pod \"609b8dbc-517d-4483-b02a-d7445cd2aa2f\" (UID: \"609b8dbc-517d-4483-b02a-d7445cd2aa2f\") " Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.231546 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86" (OuterVolumeSpecName: "kube-api-access-hth86") pod "609b8dbc-517d-4483-b02a-d7445cd2aa2f" (UID: "609b8dbc-517d-4483-b02a-d7445cd2aa2f"). InnerVolumeSpecName "kube-api-access-hth86". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.327723 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hth86\" (UniqueName: \"kubernetes.io/projected/609b8dbc-517d-4483-b02a-d7445cd2aa2f-kube-api-access-hth86\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.838642 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555672-dq68h" event={"ID":"609b8dbc-517d-4483-b02a-d7445cd2aa2f","Type":"ContainerDied","Data":"7182f54670fbf2e0ae70b5529a32090424d21ecca05ea5da6c657dbbb51c12ca"} Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.838699 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7182f54670fbf2e0ae70b5529a32090424d21ecca05ea5da6c657dbbb51c12ca" Mar 12 18:32:04 crc kubenswrapper[4926]: I0312 18:32:04.838707 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555672-dq68h" Mar 12 18:32:05 crc kubenswrapper[4926]: I0312 18:32:05.273260 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555666-x5zgg"] Mar 12 18:32:05 crc kubenswrapper[4926]: I0312 18:32:05.288328 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555666-x5zgg"] Mar 12 18:32:06 crc kubenswrapper[4926]: I0312 18:32:06.507145 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff186e2-5cbe-493a-b911-426e982888cb" path="/var/lib/kubelet/pods/5ff186e2-5cbe-493a-b911-426e982888cb/volumes" Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.044172 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-87jsw"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.053315 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gk626"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.065461 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-87jsw"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.074123 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kslsb"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.082635 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c596-account-create-update-kw2ql"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.109143 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-15a0-account-create-update-b4wdv"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.124767 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gk626"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.135654 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c596-account-create-update-kw2ql"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.146499 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kslsb"] Mar 12 18:32:07 crc kubenswrapper[4926]: I0312 18:32:07.155712 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-15a0-account-create-update-b4wdv"] Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.032493 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b886-account-create-update-sdqf6"] Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.040592 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b886-account-create-update-sdqf6"] Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.508428 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e559975-5aca-457e-8c50-465552595381" path="/var/lib/kubelet/pods/1e559975-5aca-457e-8c50-465552595381/volumes" Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.509216 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496" path="/var/lib/kubelet/pods/2e70e0f6-9596-4d7f-bbc6-e5b6e3f6c496/volumes" Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.509974 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f98d642-e5f2-44de-9259-15b5eed6b80c" path="/var/lib/kubelet/pods/3f98d642-e5f2-44de-9259-15b5eed6b80c/volumes" Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.510631 4926 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="51bf320d-e5c9-43c6-baf7-e1f2f9ee3313" path="/var/lib/kubelet/pods/51bf320d-e5c9-43c6-baf7-e1f2f9ee3313/volumes" Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.511838 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8493fd-3e35-41fb-8daa-febd2238ce1b" path="/var/lib/kubelet/pods/ce8493fd-3e35-41fb-8daa-febd2238ce1b/volumes" Mar 12 18:32:08 crc kubenswrapper[4926]: I0312 18:32:08.512498 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8a8892-82ee-4502-b76d-ca289485809b" path="/var/lib/kubelet/pods/eb8a8892-82ee-4502-b76d-ca289485809b/volumes" Mar 12 18:32:12 crc kubenswrapper[4926]: I0312 18:32:12.490467 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:32:12 crc kubenswrapper[4926]: E0312 18:32:12.491121 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.609037 4926 scope.go:117] "RemoveContainer" containerID="303e0f37b5ff32adc0c5ce3a0b0f3252de8de58704f569f8ef9bdf6d235b21d1" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.680726 4926 scope.go:117] "RemoveContainer" containerID="ed6d1abff4f7da34f442e943b17a81aeae55a22054d604db8a2f0c813a7bb890" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.711503 4926 scope.go:117] "RemoveContainer" containerID="97dffad6ad390315746d80bb46b2150bb89f6d5359806877446913c809356199" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.757397 4926 scope.go:117] "RemoveContainer" containerID="0477440cc1caee314ece47874cba75dc234908b5f0302ff1737f87d526dec4a4" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.817323 4926 scope.go:117] "RemoveContainer" containerID="2d6e21accb5b365373d4e446af9a0d2f3bb681b5f74d1a1d1d5bbeb21b98d345" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.846766 4926 scope.go:117] "RemoveContainer" containerID="00d08df89cba0281fd578a80d34eeb606178fe179a18e0bb4dca41d34fcaae6c" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.875998 4926 scope.go:117] "RemoveContainer" containerID="7093c3070454c81ffcfc913032818ad12b03f50c370515a66252d6a038735742" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.897572 4926 scope.go:117] "RemoveContainer" containerID="e6bde295c7812497b3848b236f4e958ec3015f8ebeca1a4fdbd6ee1413805cb9" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.941096 4926 scope.go:117] "RemoveContainer" containerID="2ba173648181598e4717e6ee2a92ecb7ee6351d447d5a38a4302f5af077322b5" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.965431 4926 scope.go:117] "RemoveContainer" containerID="22a2d1e8587226f797d924ae278b1c0e7b846e0cc05d9f335cbb31ac04af937c" Mar 12 18:32:15 crc kubenswrapper[4926]: I0312 18:32:15.984288 4926 scope.go:117] "RemoveContainer" containerID="d5aba66a32fd03fd9f7cad091c15aa0c8e52be10f9b3914b0869cd5e59465cde" Mar 12 18:32:25 crc kubenswrapper[4926]: I0312 18:32:25.489736 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:32:25 crc kubenswrapper[4926]: E0312 18:32:25.490503 4926 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:32:31 crc kubenswrapper[4926]: I0312 18:32:31.044676 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jlt78"] Mar 12 18:32:31 crc kubenswrapper[4926]: I0312 18:32:31.054558 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jlt78"] Mar 12 18:32:31 crc kubenswrapper[4926]: I0312 18:32:31.125721 4926 generic.go:334] "Generic (PLEG): container finished" podID="7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" containerID="573a9ceb3cb256f42bb14e281728fe4976f6e1621ed33c7cafc9ec1b14a4b12b" exitCode=0 Mar 12 18:32:31 crc kubenswrapper[4926]: I0312 18:32:31.125786 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" event={"ID":"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46","Type":"ContainerDied","Data":"573a9ceb3cb256f42bb14e281728fe4976f6e1621ed33c7cafc9ec1b14a4b12b"} Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.509804 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349ca4f5-349b-45ab-98a4-844fa00599d0" path="/var/lib/kubelet/pods/349ca4f5-349b-45ab-98a4-844fa00599d0/volumes" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.601753 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.666995 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory\") pod \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.668403 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam\") pod \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.668533 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9cp\" (UniqueName: \"kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp\") pod \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\" (UID: \"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46\") " Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.679850 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp" (OuterVolumeSpecName: "kube-api-access-7r9cp") pod "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" (UID: "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46"). InnerVolumeSpecName "kube-api-access-7r9cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.707554 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" (UID: "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.707609 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory" (OuterVolumeSpecName: "inventory") pod "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" (UID: "7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.781404 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r9cp\" (UniqueName: \"kubernetes.io/projected/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-kube-api-access-7r9cp\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.781740 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:32 crc kubenswrapper[4926]: I0312 18:32:32.781775 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.144744 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" event={"ID":"7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46","Type":"ContainerDied","Data":"2a98a7c2f2a3e2f97eddca16c9e2490f09751655edf87b4db97219d94185a80f"} Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.144849 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a98a7c2f2a3e2f97eddca16c9e2490f09751655edf87b4db97219d94185a80f" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.144779 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-njwrr" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.239527 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w"] Mar 12 18:32:33 crc kubenswrapper[4926]: E0312 18:32:33.240172 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609b8dbc-517d-4483-b02a-d7445cd2aa2f" containerName="oc" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.240201 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="609b8dbc-517d-4483-b02a-d7445cd2aa2f" containerName="oc" Mar 12 18:32:33 crc kubenswrapper[4926]: E0312 18:32:33.240241 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.240255 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.240639 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="609b8dbc-517d-4483-b02a-d7445cd2aa2f" containerName="oc" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.240690 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.241648 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.244927 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.245155 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.245878 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.246289 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.254473 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w"] Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.394703 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.394799 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.394938 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxv7d\" (UniqueName: \"kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.496743 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.496892 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.497109 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxv7d\" (UniqueName: \"kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.500686 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.506662 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.516373 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxv7d\" (UniqueName: \"kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:33 crc kubenswrapper[4926]: I0312 18:32:33.564819 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.035197 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-40f3-account-create-update-6f5kz"] Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.044728 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-n9jj2"] Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.057051 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-40f3-account-create-update-6f5kz"] Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.069320 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-n9jj2"] Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.180596 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w"] Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.503161 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1a6049-a924-4e9e-adad-e6ec84732eb9" path="/var/lib/kubelet/pods/6a1a6049-a924-4e9e-adad-e6ec84732eb9/volumes" Mar 12 18:32:34 crc kubenswrapper[4926]: I0312 18:32:34.504211 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa60a9bd-1933-44b8-ac17-be1f72c9da68" path="/var/lib/kubelet/pods/fa60a9bd-1933-44b8-ac17-be1f72c9da68/volumes" Mar 12 18:32:35 crc kubenswrapper[4926]: I0312 18:32:35.166094 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" event={"ID":"1891d243-9edd-48aa-88ff-f943dc337e8d","Type":"ContainerStarted","Data":"7548c9cec626747350a0147b1ed113cbc39ffe794cdd24b52386a11968ee94db"} Mar 12 18:32:35 crc kubenswrapper[4926]: I0312 18:32:35.166636 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" event={"ID":"1891d243-9edd-48aa-88ff-f943dc337e8d","Type":"ContainerStarted","Data":"3b48a2d4c059ac3596463fa9774e80dda754bd59c1f5280cd14ce2ca34cfd373"} Mar 12 18:32:35 crc kubenswrapper[4926]: I0312 18:32:35.207774 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" podStartSLOduration=1.699335222 podStartE2EDuration="2.207755986s" podCreationTimestamp="2026-03-12 18:32:33 +0000 UTC" firstStartedPulling="2026-03-12 18:32:34.193711885 +0000 UTC m=+1794.562338218" lastFinishedPulling="2026-03-12 18:32:34.702132649 +0000 UTC m=+1795.070758982" observedRunningTime="2026-03-12 18:32:35.207106896 +0000 UTC m=+1795.575733229" watchObservedRunningTime="2026-03-12 18:32:35.207755986 +0000 UTC m=+1795.576382319" Mar 12 18:32:36 crc kubenswrapper[4926]: I0312 18:32:36.489954 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:32:36 crc kubenswrapper[4926]: E0312 18:32:36.490676 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.029780 4926 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-2c76-account-create-update-wvgzd"] Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.042472 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2c76-account-create-update-wvgzd"] Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.053166 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-n28hs"] Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.063578 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-n28hs"] Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.071585 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-bf50-account-create-update-ckzh6"] Mar 12 18:32:37 crc kubenswrapper[4926]: I0312 18:32:37.078921 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-bf50-account-create-update-ckzh6"] Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.033397 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qr6zk"] Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.043947 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qr6zk"] Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.510646 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05959443-d099-4653-9736-7745ba1ce331" path="/var/lib/kubelet/pods/05959443-d099-4653-9736-7745ba1ce331/volumes" Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.512078 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5bf33c-5cb6-4b29-95f4-fcf47183be58" path="/var/lib/kubelet/pods/9e5bf33c-5cb6-4b29-95f4-fcf47183be58/volumes" Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.513212 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea314c69-7524-43d6-9d4f-9fdb16510952" path="/var/lib/kubelet/pods/ea314c69-7524-43d6-9d4f-9fdb16510952/volumes" Mar 12 18:32:38 crc kubenswrapper[4926]: I0312 18:32:38.514342 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7917312-5922-44df-a838-6b5452e8bb84" path="/var/lib/kubelet/pods/f7917312-5922-44df-a838-6b5452e8bb84/volumes" Mar 12 18:32:39 crc kubenswrapper[4926]: I0312 18:32:39.032841 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-72msf"] Mar 12 18:32:39 crc kubenswrapper[4926]: I0312 18:32:39.045040 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-72msf"] Mar 12 18:32:39 crc kubenswrapper[4926]: I0312 18:32:39.206706 4926 generic.go:334] "Generic (PLEG): container finished" podID="1891d243-9edd-48aa-88ff-f943dc337e8d" containerID="7548c9cec626747350a0147b1ed113cbc39ffe794cdd24b52386a11968ee94db" exitCode=0 Mar 12 18:32:39 crc kubenswrapper[4926]: I0312 18:32:39.206776 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" event={"ID":"1891d243-9edd-48aa-88ff-f943dc337e8d","Type":"ContainerDied","Data":"7548c9cec626747350a0147b1ed113cbc39ffe794cdd24b52386a11968ee94db"} Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.512988 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bad0b5a-a817-45ec-9ebb-7b30d7492ed8" path="/var/lib/kubelet/pods/6bad0b5a-a817-45ec-9ebb-7b30d7492ed8/volumes" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.760172 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.877576 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory\") pod \"1891d243-9edd-48aa-88ff-f943dc337e8d\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.877673 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam\") pod \"1891d243-9edd-48aa-88ff-f943dc337e8d\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.877801 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxv7d\" (UniqueName: \"kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d\") pod \"1891d243-9edd-48aa-88ff-f943dc337e8d\" (UID: \"1891d243-9edd-48aa-88ff-f943dc337e8d\") " Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.885868 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d" (OuterVolumeSpecName: "kube-api-access-bxv7d") pod "1891d243-9edd-48aa-88ff-f943dc337e8d" (UID: "1891d243-9edd-48aa-88ff-f943dc337e8d"). InnerVolumeSpecName "kube-api-access-bxv7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.911029 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1891d243-9edd-48aa-88ff-f943dc337e8d" (UID: "1891d243-9edd-48aa-88ff-f943dc337e8d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.922636 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory" (OuterVolumeSpecName: "inventory") pod "1891d243-9edd-48aa-88ff-f943dc337e8d" (UID: "1891d243-9edd-48aa-88ff-f943dc337e8d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.980524 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxv7d\" (UniqueName: \"kubernetes.io/projected/1891d243-9edd-48aa-88ff-f943dc337e8d-kube-api-access-bxv7d\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.980560 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:40 crc kubenswrapper[4926]: I0312 18:32:40.980569 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1891d243-9edd-48aa-88ff-f943dc337e8d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.238672 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" event={"ID":"1891d243-9edd-48aa-88ff-f943dc337e8d","Type":"ContainerDied","Data":"3b48a2d4c059ac3596463fa9774e80dda754bd59c1f5280cd14ce2ca34cfd373"} Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.238713 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b48a2d4c059ac3596463fa9774e80dda754bd59c1f5280cd14ce2ca34cfd373" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.238782 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.334542 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679"] Mar 12 18:32:41 crc kubenswrapper[4926]: E0312 18:32:41.335263 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1891d243-9edd-48aa-88ff-f943dc337e8d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.335293 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="1891d243-9edd-48aa-88ff-f943dc337e8d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.335548 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="1891d243-9edd-48aa-88ff-f943dc337e8d" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.336475 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.339078 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.339190 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.343015 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.343212 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.354860 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679"] Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.489492 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.489996 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ks6v\" (UniqueName: \"kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.490093 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.591765 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.591872 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.591962 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ks6v\" (UniqueName: 
\"kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.598327 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.610578 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.612234 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ks6v\" (UniqueName: \"kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-2d679\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:41 crc kubenswrapper[4926]: I0312 18:32:41.662660 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:32:42 crc kubenswrapper[4926]: I0312 18:32:42.230887 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679"] Mar 12 18:32:42 crc kubenswrapper[4926]: I0312 18:32:42.236215 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:32:42 crc kubenswrapper[4926]: I0312 18:32:42.257649 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" event={"ID":"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3","Type":"ContainerStarted","Data":"29b92aea004582f0f351798351f79cbd679e14449e74d5553f917bee426e56d9"} Mar 12 18:32:43 crc kubenswrapper[4926]: I0312 18:32:43.054848 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-4vtkq"] Mar 12 18:32:43 crc kubenswrapper[4926]: I0312 18:32:43.071870 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-4vtkq"] Mar 12 18:32:43 crc kubenswrapper[4926]: I0312 18:32:43.272654 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" event={"ID":"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3","Type":"ContainerStarted","Data":"3908783110cb82ca0bb6465283312cde6f24a0f3397f93570aca5ad317b5a1da"} Mar 12 18:32:43 crc kubenswrapper[4926]: I0312 18:32:43.300856 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" podStartSLOduration=1.8736497349999999 podStartE2EDuration="2.300819548s" podCreationTimestamp="2026-03-12 18:32:41 +0000 UTC" firstStartedPulling="2026-03-12 18:32:42.236005192 +0000 UTC 
m=+1802.604631515" lastFinishedPulling="2026-03-12 18:32:42.663175005 +0000 UTC m=+1803.031801328" observedRunningTime="2026-03-12 18:32:43.293477781 +0000 UTC m=+1803.662104134" watchObservedRunningTime="2026-03-12 18:32:43.300819548 +0000 UTC m=+1803.669445911" Mar 12 18:32:44 crc kubenswrapper[4926]: I0312 18:32:44.509222 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b6b9d3-0c3c-4dcc-b417-49d53269c39d" path="/var/lib/kubelet/pods/d7b6b9d3-0c3c-4dcc-b417-49d53269c39d/volumes" Mar 12 18:32:48 crc kubenswrapper[4926]: I0312 18:32:48.490693 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:32:48 crc kubenswrapper[4926]: E0312 18:32:48.491460 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:32:59 crc kubenswrapper[4926]: I0312 18:32:59.489835 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:32:59 crc kubenswrapper[4926]: E0312 18:32:59.490497 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:33:11 crc kubenswrapper[4926]: I0312 18:33:11.044364 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vc5cr"] Mar 12 18:33:11 crc kubenswrapper[4926]: I0312 18:33:11.051425 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vc5cr"] Mar 12 18:33:12 crc kubenswrapper[4926]: I0312 18:33:12.490780 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:33:12 crc kubenswrapper[4926]: E0312 18:33:12.491183 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:33:12 crc kubenswrapper[4926]: I0312 18:33:12.507389 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b" path="/var/lib/kubelet/pods/e1853a6c-a8bb-4d95-b02e-5b708c9d1b2b/volumes" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.162087 4926 scope.go:117] "RemoveContainer" containerID="d40939b357197123b7378525a59c3f91e7240cb30140873883a85a0672ad25de" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.222652 4926 scope.go:117] "RemoveContainer" containerID="f8a8e8e5fcda402042df2295b1c2cbd321aa245d9b7be946af5559093b56242a" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.273203 4926 scope.go:117] "RemoveContainer" 
containerID="7a2cc30c7cfe8ca6acf3b5029da578678f6497bf01186f5ff5a4caf50683498c" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.300485 4926 scope.go:117] "RemoveContainer" containerID="f7b35eaf50020882859b6b648c035a75d02c47868f155d967f93cf393c62f39e" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.355569 4926 scope.go:117] "RemoveContainer" containerID="1a95a2291b71efd71b4d69e1bff55bb10de0a0cbc11d599a6fbdc537cf2a1c21" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.405399 4926 scope.go:117] "RemoveContainer" containerID="7fd785662fcf1b6a3363e17fd1fb61ae49e121593b0c0273db1785eefe4c3db8" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.470732 4926 scope.go:117] "RemoveContainer" containerID="c3e155687eb67c8841a428e422ae982d9a3f9a577c50dd1606a11fd1ad93242c" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.493866 4926 scope.go:117] "RemoveContainer" containerID="ff84043d3f919db7adaecbb9d5f2f9199eabe47965d6033ec791be724e00a8f2" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.523521 4926 scope.go:117] "RemoveContainer" containerID="12643d715a7a25afe43a9b8c6f2f23a51e1343ba5e8cfc3caebf03243d5ac109" Mar 12 18:33:16 crc kubenswrapper[4926]: I0312 18:33:16.553729 4926 scope.go:117] "RemoveContainer" containerID="c4054fe5394aadede666252204c9b503dc03ddd88691e230f2deda1936fe7bc5" Mar 12 18:33:18 crc kubenswrapper[4926]: I0312 18:33:18.035332 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-txt96"] Mar 12 18:33:18 crc kubenswrapper[4926]: I0312 18:33:18.045292 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-txt96"] Mar 12 18:33:18 crc kubenswrapper[4926]: I0312 18:33:18.502925 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0f2830-bf50-4195-9dad-d4d2c9529ee9" path="/var/lib/kubelet/pods/7a0f2830-bf50-4195-9dad-d4d2c9529ee9/volumes" Mar 12 18:33:24 crc kubenswrapper[4926]: I0312 18:33:24.490566 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:33:24 crc kubenswrapper[4926]: E0312 18:33:24.491300 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:33:25 crc kubenswrapper[4926]: I0312 18:33:25.035155 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-sgnbr"] Mar 12 18:33:25 crc kubenswrapper[4926]: I0312 18:33:25.044943 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-98lfj"] Mar 12 18:33:25 crc kubenswrapper[4926]: I0312 18:33:25.057020 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-sgnbr"] Mar 12 18:33:25 crc kubenswrapper[4926]: I0312 18:33:25.068062 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-98lfj"] Mar 12 18:33:26 crc kubenswrapper[4926]: I0312 18:33:26.501297 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5704dd-cd13-4e5f-a77b-01266c63eeba" path="/var/lib/kubelet/pods/af5704dd-cd13-4e5f-a77b-01266c63eeba/volumes" Mar 12 18:33:26 crc kubenswrapper[4926]: I0312 18:33:26.502708 4926 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e96fcb3d-2f9f-468d-bafa-060a9d1f1af6" path="/var/lib/kubelet/pods/e96fcb3d-2f9f-468d-bafa-060a9d1f1af6/volumes" Mar 12 18:33:30 crc kubenswrapper[4926]: I0312 18:33:30.774650 4926 generic.go:334] "Generic (PLEG): container finished" podID="2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" containerID="3908783110cb82ca0bb6465283312cde6f24a0f3397f93570aca5ad317b5a1da" exitCode=0 Mar 12 18:33:30 crc kubenswrapper[4926]: I0312 18:33:30.774731 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" event={"ID":"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3","Type":"ContainerDied","Data":"3908783110cb82ca0bb6465283312cde6f24a0f3397f93570aca5ad317b5a1da"} Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.202473 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.303110 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory\") pod \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.303240 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam\") pod \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.303461 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ks6v\" (UniqueName: \"kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v\") pod \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\" (UID: \"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3\") " Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.322813 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v" (OuterVolumeSpecName: "kube-api-access-6ks6v") pod "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" (UID: "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3"). InnerVolumeSpecName "kube-api-access-6ks6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.341128 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" (UID: "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.346657 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory" (OuterVolumeSpecName: "inventory") pod "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" (UID: "2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.405332 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ks6v\" (UniqueName: \"kubernetes.io/projected/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-kube-api-access-6ks6v\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.405376 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.405389 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.808914 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" event={"ID":"2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3","Type":"ContainerDied","Data":"29b92aea004582f0f351798351f79cbd679e14449e74d5553f917bee426e56d9"} Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.809318 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b92aea004582f0f351798351f79cbd679e14449e74d5553f917bee426e56d9" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.809401 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-2d679" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.885510 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hckzq"] Mar 12 18:33:32 crc kubenswrapper[4926]: E0312 18:33:32.885885 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.885904 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.886083 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.886746 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.889043 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.889076 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.889683 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.890203 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:33:32 crc kubenswrapper[4926]: I0312 18:33:32.907905 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hckzq"] Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.018398 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.018473 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgwg\" (UniqueName: \"kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.018517 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.121523 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.121605 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgwg\" (UniqueName: \"kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.121658 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc 
kubenswrapper[4926]: I0312 18:33:33.125076 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.127302 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.139897 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgwg\" (UniqueName: \"kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg\") pod \"ssh-known-hosts-edpm-deployment-hckzq\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.202405 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.755783 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hckzq"] Mar 12 18:33:33 crc kubenswrapper[4926]: I0312 18:33:33.818302 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" event={"ID":"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde","Type":"ContainerStarted","Data":"b85405818faa450f3f66fb64306238ecf32d8922ec0d96eecb12de32e6dcabc3"} Mar 12 18:33:34 crc kubenswrapper[4926]: I0312 18:33:34.828574 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" event={"ID":"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde","Type":"ContainerStarted","Data":"201651c1e9fe821527e37d1cb405f0d130718068782330f1ef3a8604311947b9"} Mar 12 18:33:34 crc kubenswrapper[4926]: I0312 18:33:34.856915 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" podStartSLOduration=2.408289462 podStartE2EDuration="2.85689196s" podCreationTimestamp="2026-03-12 18:33:32 +0000 UTC" firstStartedPulling="2026-03-12 18:33:33.763907509 +0000 UTC m=+1854.132533882" lastFinishedPulling="2026-03-12 18:33:34.212510037 +0000 UTC m=+1854.581136380" observedRunningTime="2026-03-12 18:33:34.846075714 +0000 UTC m=+1855.214702057" watchObservedRunningTime="2026-03-12 18:33:34.85689196 +0000 UTC m=+1855.225518303" Mar 12 18:33:35 crc kubenswrapper[4926]: I0312 18:33:35.496666 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:33:35 crc kubenswrapper[4926]: E0312 18:33:35.496983 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:33:38 crc 
kubenswrapper[4926]: I0312 18:33:38.044381 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lgvzs"] Mar 12 18:33:38 crc kubenswrapper[4926]: I0312 18:33:38.057369 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lgvzs"] Mar 12 18:33:38 crc kubenswrapper[4926]: I0312 18:33:38.501889 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac4b5d6-fb31-4955-8679-db9d3ff63c10" path="/var/lib/kubelet/pods/dac4b5d6-fb31-4955-8679-db9d3ff63c10/volumes" Mar 12 18:33:41 crc kubenswrapper[4926]: I0312 18:33:41.907031 4926 generic.go:334] "Generic (PLEG): container finished" podID="d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" containerID="201651c1e9fe821527e37d1cb405f0d130718068782330f1ef3a8604311947b9" exitCode=0 Mar 12 18:33:41 crc kubenswrapper[4926]: I0312 18:33:41.907084 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" event={"ID":"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde","Type":"ContainerDied","Data":"201651c1e9fe821527e37d1cb405f0d130718068782330f1ef3a8604311947b9"} Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.417649 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.540463 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam\") pod \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.540605 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrgwg\" (UniqueName: \"kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg\") pod \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.541166 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0\") pod \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\" (UID: \"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde\") " Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.551718 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg" (OuterVolumeSpecName: "kube-api-access-jrgwg") pod "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" (UID: "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde"). InnerVolumeSpecName "kube-api-access-jrgwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.573718 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" (UID: "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.591340 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" (UID: "d48eb883-81b1-4ad6-bbbe-f9c5a9779fde"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.644884 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.644925 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrgwg\" (UniqueName: \"kubernetes.io/projected/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-kube-api-access-jrgwg\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.644939 4926 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d48eb883-81b1-4ad6-bbbe-f9c5a9779fde-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.926813 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" event={"ID":"d48eb883-81b1-4ad6-bbbe-f9c5a9779fde","Type":"ContainerDied","Data":"b85405818faa450f3f66fb64306238ecf32d8922ec0d96eecb12de32e6dcabc3"} Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.926865 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b85405818faa450f3f66fb64306238ecf32d8922ec0d96eecb12de32e6dcabc3" Mar 12 18:33:43 crc kubenswrapper[4926]: I0312 18:33:43.926929 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hckzq" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.030953 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll"] Mar 12 18:33:44 crc kubenswrapper[4926]: E0312 18:33:44.031836 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" containerName="ssh-known-hosts-edpm-deployment" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.031872 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" containerName="ssh-known-hosts-edpm-deployment" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.032336 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48eb883-81b1-4ad6-bbbe-f9c5a9779fde" containerName="ssh-known-hosts-edpm-deployment" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.033133 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.035133 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.035459 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.036141 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.049357 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll"] Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.052264 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.054327 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.054684 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6hd\" (UniqueName: \"kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.054793 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.155634 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.155798 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6hd\" (UniqueName: \"kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.155839 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.166655 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.166891 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.175304 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6hd\" (UniqueName: \"kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zxwll\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.357223 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.891154 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll"] Mar 12 18:33:44 crc kubenswrapper[4926]: I0312 18:33:44.936529 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" event={"ID":"f12fab7b-7f88-441e-b230-551dc9ffa270","Type":"ContainerStarted","Data":"2a4b919a20dc2bec583121219cdf56f1fee5a07381e675a44dfaa71d5cdaf38c"} Mar 12 18:33:45 crc kubenswrapper[4926]: I0312 18:33:45.961851 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" event={"ID":"f12fab7b-7f88-441e-b230-551dc9ffa270","Type":"ContainerStarted","Data":"3eace6f424559836667819de65864f24f87afd1b5a74cbc8cf1b2289d0902d1f"} Mar 12 18:33:45 crc kubenswrapper[4926]: I0312 18:33:45.978667 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" podStartSLOduration=1.593961612 podStartE2EDuration="1.978645828s" podCreationTimestamp="2026-03-12 18:33:44 +0000 UTC" firstStartedPulling="2026-03-12 18:33:44.891205469 +0000 UTC m=+1865.259831802" lastFinishedPulling="2026-03-12 18:33:45.275889675 +0000 UTC m=+1865.644516018" observedRunningTime="2026-03-12 18:33:45.977358229 +0000 UTC m=+1866.345984602" watchObservedRunningTime="2026-03-12 18:33:45.978645828 +0000 UTC m=+1866.347272161" Mar 12 18:33:47 crc kubenswrapper[4926]: I0312 18:33:47.490322 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:33:47 crc kubenswrapper[4926]: E0312 18:33:47.490697 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:33:53 crc kubenswrapper[4926]: I0312 18:33:53.032276 4926 generic.go:334] "Generic (PLEG): container finished" podID="f12fab7b-7f88-441e-b230-551dc9ffa270" containerID="3eace6f424559836667819de65864f24f87afd1b5a74cbc8cf1b2289d0902d1f" exitCode=0 Mar 12 18:33:53 crc kubenswrapper[4926]: I0312 18:33:53.032383 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" event={"ID":"f12fab7b-7f88-441e-b230-551dc9ffa270","Type":"ContainerDied","Data":"3eace6f424559836667819de65864f24f87afd1b5a74cbc8cf1b2289d0902d1f"} Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.519198 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.659632 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory\") pod \"f12fab7b-7f88-441e-b230-551dc9ffa270\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.660126 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam\") pod \"f12fab7b-7f88-441e-b230-551dc9ffa270\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.660258 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np6hd\" (UniqueName: \"kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd\") pod \"f12fab7b-7f88-441e-b230-551dc9ffa270\" (UID: \"f12fab7b-7f88-441e-b230-551dc9ffa270\") " Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.667210 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd" (OuterVolumeSpecName: "kube-api-access-np6hd") pod "f12fab7b-7f88-441e-b230-551dc9ffa270" (UID: "f12fab7b-7f88-441e-b230-551dc9ffa270"). InnerVolumeSpecName "kube-api-access-np6hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.691085 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory" (OuterVolumeSpecName: "inventory") pod "f12fab7b-7f88-441e-b230-551dc9ffa270" (UID: "f12fab7b-7f88-441e-b230-551dc9ffa270"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.703577 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f12fab7b-7f88-441e-b230-551dc9ffa270" (UID: "f12fab7b-7f88-441e-b230-551dc9ffa270"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.764650 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.764797 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f12fab7b-7f88-441e-b230-551dc9ffa270-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:54 crc kubenswrapper[4926]: I0312 18:33:54.764899 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np6hd\" (UniqueName: \"kubernetes.io/projected/f12fab7b-7f88-441e-b230-551dc9ffa270-kube-api-access-np6hd\") on node \"crc\" DevicePath \"\"" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.056040 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" event={"ID":"f12fab7b-7f88-441e-b230-551dc9ffa270","Type":"ContainerDied","Data":"2a4b919a20dc2bec583121219cdf56f1fee5a07381e675a44dfaa71d5cdaf38c"} Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.056098 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a4b919a20dc2bec583121219cdf56f1fee5a07381e675a44dfaa71d5cdaf38c" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.056430 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zxwll" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.149733 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc"] Mar 12 18:33:55 crc kubenswrapper[4926]: E0312 18:33:55.150523 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12fab7b-7f88-441e-b230-551dc9ffa270" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.150612 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12fab7b-7f88-441e-b230-551dc9ffa270" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.150893 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12fab7b-7f88-441e-b230-551dc9ffa270" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.151790 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.154921 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.157368 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.157638 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.157774 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-g65rs" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.168667 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc"] Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.274775 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8msml\" (UniqueName: \"kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.275359 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.275424 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.377630 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.377680 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.377766 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8msml\" (UniqueName: \"kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.387212 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.387222 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.398327 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8msml\" (UniqueName: \"kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:55 crc kubenswrapper[4926]: I0312 18:33:55.483988 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:33:56 crc kubenswrapper[4926]: W0312 18:33:56.026659 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c9d413_012b_47aa_a519_113685eb478c.slice/crio-652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637 WatchSource:0}: Error finding container 652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637: Status 404 returned error can't find the container with id 652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637 Mar 12 18:33:56 crc kubenswrapper[4926]: I0312 18:33:56.030069 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc"] Mar 12 18:33:56 crc kubenswrapper[4926]: I0312 18:33:56.067365 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" event={"ID":"63c9d413-012b-47aa-a519-113685eb478c","Type":"ContainerStarted","Data":"652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637"} Mar 12 18:33:57 crc kubenswrapper[4926]: I0312 18:33:57.079485 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" event={"ID":"63c9d413-012b-47aa-a519-113685eb478c","Type":"ContainerStarted","Data":"25268059ef3b3f8ab9572b4422f7b00bab2e017511cd7ff4bdb505fd3d2abecf"} Mar 12 18:33:57 crc kubenswrapper[4926]: I0312 18:33:57.100197 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" podStartSLOduration=1.559594038 podStartE2EDuration="2.10016892s" podCreationTimestamp="2026-03-12 18:33:55 +0000 UTC" firstStartedPulling="2026-03-12 18:33:56.028585173 +0000 UTC m=+1876.397211506" lastFinishedPulling="2026-03-12 18:33:56.569160055 +0000 UTC 
m=+1876.937786388" observedRunningTime="2026-03-12 18:33:57.099091657 +0000 UTC m=+1877.467717990" watchObservedRunningTime="2026-03-12 18:33:57.10016892 +0000 UTC m=+1877.468795273" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.132896 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555674-2b7bv"] Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.134743 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.137940 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.138497 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.138990 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.144090 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555674-2b7bv"] Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.283324 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xj22\" (UniqueName: \"kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22\") pod \"auto-csr-approver-29555674-2b7bv\" (UID: \"540a72de-64b2-41af-9caf-95895f11cb79\") " pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.399487 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xj22\" (UniqueName: \"kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22\") pod \"auto-csr-approver-29555674-2b7bv\" (UID: \"540a72de-64b2-41af-9caf-95895f11cb79\") " pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.427926 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xj22\" (UniqueName: \"kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22\") pod \"auto-csr-approver-29555674-2b7bv\" (UID: \"540a72de-64b2-41af-9caf-95895f11cb79\") " pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.452906 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:00 crc kubenswrapper[4926]: I0312 18:34:00.887055 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555674-2b7bv"] Mar 12 18:34:00 crc kubenswrapper[4926]: W0312 18:34:00.897243 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod540a72de_64b2_41af_9caf_95895f11cb79.slice/crio-845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538 WatchSource:0}: Error finding container 845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538: Status 404 returned error can't find the container with id 845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538 Mar 12 18:34:01 crc kubenswrapper[4926]: I0312 18:34:01.121772 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" event={"ID":"540a72de-64b2-41af-9caf-95895f11cb79","Type":"ContainerStarted","Data":"845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538"} Mar 12 18:34:01 crc kubenswrapper[4926]: I0312 18:34:01.489572 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:34:01 crc kubenswrapper[4926]: E0312 18:34:01.489919 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:34:06 crc kubenswrapper[4926]: I0312 18:34:06.170812 4926 generic.go:334] "Generic (PLEG): container finished" podID="63c9d413-012b-47aa-a519-113685eb478c" containerID="25268059ef3b3f8ab9572b4422f7b00bab2e017511cd7ff4bdb505fd3d2abecf" exitCode=0 Mar 12 18:34:06 crc kubenswrapper[4926]: I0312 18:34:06.170969 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" event={"ID":"63c9d413-012b-47aa-a519-113685eb478c","Type":"ContainerDied","Data":"25268059ef3b3f8ab9572b4422f7b00bab2e017511cd7ff4bdb505fd3d2abecf"} Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.187759 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" event={"ID":"540a72de-64b2-41af-9caf-95895f11cb79","Type":"ContainerStarted","Data":"804c89c001a28f1eae59ba1c1cb327e0cc2c54749f5a07cd9cd07164cc1b86c4"} Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.216909 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" podStartSLOduration=1.356921442 podStartE2EDuration="7.21687683s" podCreationTimestamp="2026-03-12 18:34:00 +0000 UTC" firstStartedPulling="2026-03-12 18:34:00.899710811 +0000 UTC m=+1881.268337144" lastFinishedPulling="2026-03-12 18:34:06.759666159 +0000 UTC m=+1887.128292532" observedRunningTime="2026-03-12 18:34:07.205110269 +0000 UTC m=+1887.573736602" watchObservedRunningTime="2026-03-12 18:34:07.21687683 +0000 UTC m=+1887.585503203" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.600684 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.743838 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam\") pod \"63c9d413-012b-47aa-a519-113685eb478c\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.744019 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory\") pod \"63c9d413-012b-47aa-a519-113685eb478c\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.744086 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8msml\" (UniqueName: \"kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml\") pod \"63c9d413-012b-47aa-a519-113685eb478c\" (UID: \"63c9d413-012b-47aa-a519-113685eb478c\") " Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.751539 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml" (OuterVolumeSpecName: "kube-api-access-8msml") pod "63c9d413-012b-47aa-a519-113685eb478c" (UID: "63c9d413-012b-47aa-a519-113685eb478c"). InnerVolumeSpecName "kube-api-access-8msml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.775952 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory" (OuterVolumeSpecName: "inventory") pod "63c9d413-012b-47aa-a519-113685eb478c" (UID: "63c9d413-012b-47aa-a519-113685eb478c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.779774 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63c9d413-012b-47aa-a519-113685eb478c" (UID: "63c9d413-012b-47aa-a519-113685eb478c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.847466 4926 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.847521 4926 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63c9d413-012b-47aa-a519-113685eb478c-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 18:34:07 crc kubenswrapper[4926]: I0312 18:34:07.847540 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8msml\" (UniqueName: \"kubernetes.io/projected/63c9d413-012b-47aa-a519-113685eb478c-kube-api-access-8msml\") on node \"crc\" DevicePath \"\"" Mar 12 18:34:08 crc kubenswrapper[4926]: I0312 18:34:08.205163 4926 generic.go:334] "Generic (PLEG): container finished" podID="540a72de-64b2-41af-9caf-95895f11cb79" containerID="804c89c001a28f1eae59ba1c1cb327e0cc2c54749f5a07cd9cd07164cc1b86c4" exitCode=0 Mar 12 18:34:08 crc kubenswrapper[4926]: I0312 18:34:08.205236 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" event={"ID":"540a72de-64b2-41af-9caf-95895f11cb79","Type":"ContainerDied","Data":"804c89c001a28f1eae59ba1c1cb327e0cc2c54749f5a07cd9cd07164cc1b86c4"} Mar 12 18:34:08 crc kubenswrapper[4926]: I0312 18:34:08.213053 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" event={"ID":"63c9d413-012b-47aa-a519-113685eb478c","Type":"ContainerDied","Data":"652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637"} Mar 12 18:34:08 crc kubenswrapper[4926]: I0312 18:34:08.213124 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="652a64dc7fb64ba5f266c774add42dc7aede4a98794574de6c44fc9a65426637" Mar 12 18:34:08 crc kubenswrapper[4926]: I0312 18:34:08.213188 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc" Mar 12 18:34:09 crc kubenswrapper[4926]: I0312 18:34:09.575990 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:09 crc kubenswrapper[4926]: I0312 18:34:09.683595 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xj22\" (UniqueName: \"kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22\") pod \"540a72de-64b2-41af-9caf-95895f11cb79\" (UID: \"540a72de-64b2-41af-9caf-95895f11cb79\") " Mar 12 18:34:09 crc kubenswrapper[4926]: I0312 18:34:09.691003 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22" (OuterVolumeSpecName: "kube-api-access-7xj22") pod "540a72de-64b2-41af-9caf-95895f11cb79" (UID: "540a72de-64b2-41af-9caf-95895f11cb79"). InnerVolumeSpecName "kube-api-access-7xj22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:34:09 crc kubenswrapper[4926]: I0312 18:34:09.786260 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xj22\" (UniqueName: \"kubernetes.io/projected/540a72de-64b2-41af-9caf-95895f11cb79-kube-api-access-7xj22\") on node \"crc\" DevicePath \"\"" Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.242259 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" event={"ID":"540a72de-64b2-41af-9caf-95895f11cb79","Type":"ContainerDied","Data":"845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538"} Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.242648 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="845fce3c4ea08b9cf4883148e9b4ea6682a491350bc2bfffb85900ad2ab21538" Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.242335 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555674-2b7bv" Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.306614 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555668-xnfj4"] Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.317165 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555668-xnfj4"] Mar 12 18:34:10 crc kubenswrapper[4926]: I0312 18:34:10.504221 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbd98cc-c135-485c-9b18-c31f19321a38" path="/var/lib/kubelet/pods/ddbd98cc-c135-485c-9b18-c31f19321a38/volumes" Mar 12 18:34:12 crc kubenswrapper[4926]: I0312 18:34:12.490638 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:34:12 crc kubenswrapper[4926]: E0312 18:34:12.491329 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:34:16 crc kubenswrapper[4926]: I0312 18:34:16.774466 4926 scope.go:117] "RemoveContainer" containerID="d333e02f3e64bf137ca7da16020fb743166cb48ce41f070ab0d454452957be39" Mar 12 18:34:16 crc kubenswrapper[4926]: I0312 18:34:16.815110 4926 scope.go:117] "RemoveContainer" containerID="1b2fbc6d93b78d1252c9b5b62fe658eafa93cdb86d8acefaa37b868dbfa19698" Mar 12 18:34:16 crc kubenswrapper[4926]: I0312 18:34:16.867311 4926 scope.go:117] "RemoveContainer" containerID="96adbf85f267b1793c500e8c152eb5c14d98f7080edf6f6f116872d3c431cb0a" Mar 12 18:34:16 crc kubenswrapper[4926]: I0312 18:34:16.905138 4926 scope.go:117] "RemoveContainer" containerID="74c22411866d45d292c4082e0d4be48c52ad4683049047a45e2de32d9da3478f" Mar 12 18:34:16 crc kubenswrapper[4926]: I0312 18:34:16.950516 4926 scope.go:117] "RemoveContainer" containerID="5b07bfa58e0c68e62027f67884fd04fdf4f157c10b82889fbed4fe1cd06870d8" Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.049393 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-382f-account-create-update-7sn9z"] Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.062519 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-db-create-55g54"] Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.070001 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-382f-account-create-update-7sn9z"] Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.077410 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-55g54"] Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.509722 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb8f27f-336b-4acc-8165-0c53c1643084" path="/var/lib/kubelet/pods/4cb8f27f-336b-4acc-8165-0c53c1643084/volumes" Mar 12 18:34:22 crc kubenswrapper[4926]: I0312 18:34:22.510998 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0c0d35-e463-4b5c-a15f-cdfc808c498c" path="/var/lib/kubelet/pods/fe0c0d35-e463-4b5c-a15f-cdfc808c498c/volumes" Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.038036 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0630-account-create-update-knx9n"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.046494 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f45b-account-create-update-rt9j9"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.057626 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-t4krn"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.067681 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cctxg"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.075197 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0630-account-create-update-knx9n"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.082491 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f45b-account-create-update-rt9j9"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.090327 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cctxg"] Mar 12 18:34:23 crc kubenswrapper[4926]: I0312 18:34:23.097818 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-t4krn"] Mar 12 18:34:24 crc kubenswrapper[4926]: I0312 18:34:24.511387 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15db9185-0441-431d-98db-5701f9b244be" path="/var/lib/kubelet/pods/15db9185-0441-431d-98db-5701f9b244be/volumes" Mar 12 18:34:24 crc kubenswrapper[4926]: I0312 18:34:24.513625 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a2618b-1e0a-4246-b63c-582d2fe2f847" path="/var/lib/kubelet/pods/29a2618b-1e0a-4246-b63c-582d2fe2f847/volumes" Mar 12 18:34:24 crc kubenswrapper[4926]: I0312 18:34:24.515223 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b33ff126-1b0c-47f0-a4c1-18e1297fa81d" path="/var/lib/kubelet/pods/b33ff126-1b0c-47f0-a4c1-18e1297fa81d/volumes" Mar 12 18:34:24 crc kubenswrapper[4926]: I0312 18:34:24.516590 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc4b3e47-ab9e-46c0-be37-9d2e94721be6" path="/var/lib/kubelet/pods/dc4b3e47-ab9e-46c0-be37-9d2e94721be6/volumes" Mar 12 18:34:25 crc kubenswrapper[4926]: I0312 18:34:25.490154 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:34:25 crc kubenswrapper[4926]: E0312 18:34:25.490755 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:34:39 crc kubenswrapper[4926]: I0312 18:34:39.490795 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:34:39 crc kubenswrapper[4926]: E0312 18:34:39.492100 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:34:48 crc kubenswrapper[4926]: I0312 18:34:48.067638 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cbqv8"] Mar 12 18:34:48 crc kubenswrapper[4926]: I0312 18:34:48.081171 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cbqv8"] Mar 12 18:34:48 crc kubenswrapper[4926]: I0312 18:34:48.503957 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57" path="/var/lib/kubelet/pods/436f0bee-a5ab-4d4f-bbaa-3c4c568f2a57/volumes" Mar 12 18:34:52 crc kubenswrapper[4926]: I0312 18:34:52.491022 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:34:52 crc kubenswrapper[4926]: E0312 18:34:52.492166 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.900822 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nz8z6/must-gather-x4gfw"] Mar 12 18:34:54 crc kubenswrapper[4926]: E0312 18:34:54.902507 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540a72de-64b2-41af-9caf-95895f11cb79" containerName="oc" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.902613 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="540a72de-64b2-41af-9caf-95895f11cb79" containerName="oc" Mar 12 18:34:54 crc kubenswrapper[4926]: E0312 18:34:54.902732 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c9d413-012b-47aa-a519-113685eb478c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.902819 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c9d413-012b-47aa-a519-113685eb478c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.903128 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="540a72de-64b2-41af-9caf-95895f11cb79" containerName="oc" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.903227 4926 
memory_manager.go:354] "RemoveStaleState removing state" podUID="63c9d413-012b-47aa-a519-113685eb478c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.904509 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.906562 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nz8z6"/"openshift-service-ca.crt" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.906886 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-nz8z6"/"kube-root-ca.crt" Mar 12 18:34:54 crc kubenswrapper[4926]: I0312 18:34:54.932965 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nz8z6/must-gather-x4gfw"] Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.008086 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.008165 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44tw\" (UniqueName: \"kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.110861 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.111229 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.111425 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44tw\" (UniqueName: \"kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.131084 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44tw\" (UniqueName: \"kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw\") pod \"must-gather-x4gfw\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.224904 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.700547 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nz8z6/must-gather-x4gfw"] Mar 12 18:34:55 crc kubenswrapper[4926]: I0312 18:34:55.755092 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" event={"ID":"2562de93-d8c3-4055-8ab2-3c55b4f3c830","Type":"ContainerStarted","Data":"93a53026295ef16f089cdb9826fdb5dbdb0b51d38d9f2aedd4f23b4040bfc0b8"} Mar 12 18:35:02 crc kubenswrapper[4926]: I0312 18:35:02.847730 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" event={"ID":"2562de93-d8c3-4055-8ab2-3c55b4f3c830","Type":"ContainerStarted","Data":"9983eed33b5073c82867ffe643c25901fb02264d34aa9779308efc3e4497884a"} Mar 12 18:35:02 crc kubenswrapper[4926]: I0312 18:35:02.848384 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" event={"ID":"2562de93-d8c3-4055-8ab2-3c55b4f3c830","Type":"ContainerStarted","Data":"4dbea05e4f0fb083c128a04cb1592adbe2b7259461b2a690428955b1ed545c5d"} Mar 12 18:35:02 crc kubenswrapper[4926]: I0312 18:35:02.891207 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" podStartSLOduration=2.944702807 podStartE2EDuration="8.891185854s" podCreationTimestamp="2026-03-12 18:34:54 +0000 UTC" firstStartedPulling="2026-03-12 18:34:55.70678955 +0000 UTC m=+1936.075415913" lastFinishedPulling="2026-03-12 18:35:01.653272627 +0000 UTC m=+1942.021898960" observedRunningTime="2026-03-12 18:35:02.881807678 +0000 UTC m=+1943.250434041" watchObservedRunningTime="2026-03-12 18:35:02.891185854 +0000 UTC m=+1943.259812187" Mar 12 18:35:04 crc kubenswrapper[4926]: I0312 18:35:04.490303 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:35:04 crc kubenswrapper[4926]: E0312 18:35:04.490887 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.304269 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-ss92r"] Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.306197 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.309509 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nz8z6"/"default-dockercfg-9j8mc" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.448095 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h879n\" (UniqueName: \"kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.448582 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.551064 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.551132 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h879n\" (UniqueName: \"kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.551268 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.580069 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h879n\" (UniqueName: \"kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n\") pod \"crc-debug-ss92r\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.626112 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:06 crc kubenswrapper[4926]: W0312 18:35:06.671557 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5381bdf5_d86c_4da5_ac97_b5a3a1997005.slice/crio-8650f457341f9acefdfbaa4223783f6fe0533785330f983f51126ef0fa194e76 WatchSource:0}: Error finding container 8650f457341f9acefdfbaa4223783f6fe0533785330f983f51126ef0fa194e76: Status 404 returned error can't find the container with id 8650f457341f9acefdfbaa4223783f6fe0533785330f983f51126ef0fa194e76 Mar 12 18:35:06 crc kubenswrapper[4926]: I0312 18:35:06.890196 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" event={"ID":"5381bdf5-d86c-4da5-ac97-b5a3a1997005","Type":"ContainerStarted","Data":"8650f457341f9acefdfbaa4223783f6fe0533785330f983f51126ef0fa194e76"} Mar 12 18:35:10 crc kubenswrapper[4926]: I0312 18:35:10.032958 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-88dpb"] Mar 12 18:35:10 crc kubenswrapper[4926]: I0312 18:35:10.050022 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-88dpb"] Mar 12 18:35:10 crc kubenswrapper[4926]: I0312 18:35:10.500792 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30718d9-b986-4490-9e43-56eaa459aeb5" path="/var/lib/kubelet/pods/d30718d9-b986-4490-9e43-56eaa459aeb5/volumes" Mar 12 18:35:11 crc kubenswrapper[4926]: I0312 18:35:11.041413 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2wcs"] Mar 12 18:35:11 crc kubenswrapper[4926]: I0312 18:35:11.048049 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-x2wcs"] Mar 12 18:35:12 crc kubenswrapper[4926]: I0312 18:35:12.503279 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b478ba05-2155-4b13-a58a-002be25403d0" path="/var/lib/kubelet/pods/b478ba05-2155-4b13-a58a-002be25403d0/volumes" Mar 12 18:35:17 crc kubenswrapper[4926]: I0312 18:35:17.071467 4926 scope.go:117] "RemoveContainer" containerID="a9bf2ae52889d181443948b724481f92343c2befd9093bc8cc132900dface592" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.586013 4926 scope.go:117] "RemoveContainer" containerID="6afc876661934d69ed6bb90885f853f8b5249b4339a13cab5b59071dc578e401" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.628825 4926 scope.go:117] "RemoveContainer" containerID="976b5bff6bf1beb72cdab8caedbb157ee653f67334ae07735bc5919349b4ae10" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.678712 4926 scope.go:117] "RemoveContainer" containerID="683758cbf77fef46bca458acf7d282b0867f1aa1a065fbc9925548f2b1c73316" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.740477 4926 scope.go:117] "RemoveContainer" containerID="2f4d726cd118a123d7b32175fc4e520ff68fb0c877630e02cc5267fde94f47a0" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.836986 4926 scope.go:117] "RemoveContainer" containerID="14cbad167526774b9aa42516a514a2cffabd952a439286d5c8bfe698bbd81e8b" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.888310 4926 scope.go:117] "RemoveContainer" containerID="8476342456babac729c7086b287eb9b60e559eb560bedd1d78d42784e15f0ff2" Mar 12 18:35:18 crc kubenswrapper[4926]: I0312 18:35:18.906693 4926 scope.go:117] "RemoveContainer" containerID="f80b93ed7cbf010c51c1bbf4a01b83fdc0dc54adf44c99408ef3f7476eea541b" Mar 12 18:35:18 
crc kubenswrapper[4926]: I0312 18:35:18.930729 4926 scope.go:117] "RemoveContainer" containerID="b0665b4cdb9e47cd07d74cad2d92a7eeb8aadf990846b2f50cc7f0a3a493d8dd" Mar 12 18:35:19 crc kubenswrapper[4926]: I0312 18:35:19.046966 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" event={"ID":"5381bdf5-d86c-4da5-ac97-b5a3a1997005","Type":"ContainerStarted","Data":"3b7cae5f88beda72e15102b000ae8d919d1219da1f42e5b5be4f546f7fa97f6c"} Mar 12 18:35:19 crc kubenswrapper[4926]: I0312 18:35:19.063952 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" podStartSLOduration=1.05588493 podStartE2EDuration="13.063930328s" podCreationTimestamp="2026-03-12 18:35:06 +0000 UTC" firstStartedPulling="2026-03-12 18:35:06.67388103 +0000 UTC m=+1947.042507363" lastFinishedPulling="2026-03-12 18:35:18.681926428 +0000 UTC m=+1959.050552761" observedRunningTime="2026-03-12 18:35:19.061777631 +0000 UTC m=+1959.430403974" watchObservedRunningTime="2026-03-12 18:35:19.063930328 +0000 UTC m=+1959.432556661" Mar 12 18:35:19 crc kubenswrapper[4926]: I0312 18:35:19.489987 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:35:19 crc kubenswrapper[4926]: E0312 18:35:19.490207 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:35:25 crc kubenswrapper[4926]: I0312 18:35:25.033818 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vwzff"] Mar 12 18:35:25 crc kubenswrapper[4926]: I0312 18:35:25.045977 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vwzff"] Mar 12 18:35:26 crc kubenswrapper[4926]: I0312 18:35:26.502827 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a36d0b-fd08-4c93-846c-688c71055113" path="/var/lib/kubelet/pods/76a36d0b-fd08-4c93-846c-688c71055113/volumes" Mar 12 18:35:33 crc kubenswrapper[4926]: I0312 18:35:33.202452 4926 generic.go:334] "Generic (PLEG): container finished" podID="5381bdf5-d86c-4da5-ac97-b5a3a1997005" containerID="3b7cae5f88beda72e15102b000ae8d919d1219da1f42e5b5be4f546f7fa97f6c" exitCode=0 Mar 12 18:35:33 crc kubenswrapper[4926]: I0312 18:35:33.202506 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" event={"ID":"5381bdf5-d86c-4da5-ac97-b5a3a1997005","Type":"ContainerDied","Data":"3b7cae5f88beda72e15102b000ae8d919d1219da1f42e5b5be4f546f7fa97f6c"} Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.335788 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.368501 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-ss92r"] Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.373740 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-ss92r"] Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.384197 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h879n\" (UniqueName: \"kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n\") pod \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.384478 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host\") pod \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\" (UID: \"5381bdf5-d86c-4da5-ac97-b5a3a1997005\") " Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.384658 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host" (OuterVolumeSpecName: "host") pod "5381bdf5-d86c-4da5-ac97-b5a3a1997005" (UID: "5381bdf5-d86c-4da5-ac97-b5a3a1997005"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.384936 4926 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5381bdf5-d86c-4da5-ac97-b5a3a1997005-host\") on node \"crc\" DevicePath \"\"" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.388699 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n" (OuterVolumeSpecName: "kube-api-access-h879n") pod "5381bdf5-d86c-4da5-ac97-b5a3a1997005" (UID: "5381bdf5-d86c-4da5-ac97-b5a3a1997005"). InnerVolumeSpecName "kube-api-access-h879n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.486233 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h879n\" (UniqueName: \"kubernetes.io/projected/5381bdf5-d86c-4da5-ac97-b5a3a1997005-kube-api-access-h879n\") on node \"crc\" DevicePath \"\"" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.489605 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:35:34 crc kubenswrapper[4926]: E0312 18:35:34.489902 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:35:34 crc kubenswrapper[4926]: I0312 18:35:34.499779 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5381bdf5-d86c-4da5-ac97-b5a3a1997005" path="/var/lib/kubelet/pods/5381bdf5-d86c-4da5-ac97-b5a3a1997005/volumes" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.220294 4926 scope.go:117] "RemoveContainer" containerID="3b7cae5f88beda72e15102b000ae8d919d1219da1f42e5b5be4f546f7fa97f6c" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.220395 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-ss92r" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.574155 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-6jp9x"] Mar 12 18:35:35 crc kubenswrapper[4926]: E0312 18:35:35.574513 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381bdf5-d86c-4da5-ac97-b5a3a1997005" containerName="container-00" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.574525 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381bdf5-d86c-4da5-ac97-b5a3a1997005" containerName="container-00" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.574690 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381bdf5-d86c-4da5-ac97-b5a3a1997005" containerName="container-00" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.575240 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.578286 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-nz8z6"/"default-dockercfg-9j8mc" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.606358 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.606506 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlff\" (UniqueName: \"kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.715889 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.716007 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlff\" (UniqueName: \"kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.716050 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.734194 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlff\" (UniqueName: \"kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff\") pod \"crc-debug-6jp9x\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:35 crc kubenswrapper[4926]: I0312 18:35:35.891709 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:36 crc kubenswrapper[4926]: I0312 18:35:36.235707 4926 generic.go:334] "Generic (PLEG): container finished" podID="23ffd21c-99e8-47f0-945a-70fb2da71885" containerID="c1196ac145f33bde3cf7ef54fd8462b66ce61b83f03e51afe39dafbe2872f728" exitCode=1 Mar 12 18:35:36 crc kubenswrapper[4926]: I0312 18:35:36.235835 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" event={"ID":"23ffd21c-99e8-47f0-945a-70fb2da71885","Type":"ContainerDied","Data":"c1196ac145f33bde3cf7ef54fd8462b66ce61b83f03e51afe39dafbe2872f728"} Mar 12 18:35:36 crc kubenswrapper[4926]: I0312 18:35:36.236028 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" event={"ID":"23ffd21c-99e8-47f0-945a-70fb2da71885","Type":"ContainerStarted","Data":"fd73f0549d58a377ffca80b1bd0fbac707795e9afe2863c99dc4b3bf9df893a3"} Mar 12 18:35:36 crc kubenswrapper[4926]: I0312 18:35:36.279646 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-6jp9x"] Mar 12 18:35:36 crc kubenswrapper[4926]: I0312 18:35:36.288422 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nz8z6/crc-debug-6jp9x"] Mar 12 18:35:36 crc kubenswrapper[4926]: E0312 18:35:36.342697 4926 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ffd21c_99e8_47f0_945a_70fb2da71885.slice/crio-conmon-c1196ac145f33bde3cf7ef54fd8462b66ce61b83f03e51afe39dafbe2872f728.scope\": RecentStats: unable to find data in memory cache]" Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.347713 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.449031 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host\") pod \"23ffd21c-99e8-47f0-945a-70fb2da71885\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.449126 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host" (OuterVolumeSpecName: "host") pod "23ffd21c-99e8-47f0-945a-70fb2da71885" (UID: "23ffd21c-99e8-47f0-945a-70fb2da71885"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.449652 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhlff\" (UniqueName: \"kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff\") pod \"23ffd21c-99e8-47f0-945a-70fb2da71885\" (UID: \"23ffd21c-99e8-47f0-945a-70fb2da71885\") " Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.450121 4926 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23ffd21c-99e8-47f0-945a-70fb2da71885-host\") on node \"crc\" DevicePath \"\"" Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.455263 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff" (OuterVolumeSpecName: "kube-api-access-lhlff") pod "23ffd21c-99e8-47f0-945a-70fb2da71885" (UID: "23ffd21c-99e8-47f0-945a-70fb2da71885"). InnerVolumeSpecName "kube-api-access-lhlff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:35:37 crc kubenswrapper[4926]: I0312 18:35:37.551578 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhlff\" (UniqueName: \"kubernetes.io/projected/23ffd21c-99e8-47f0-945a-70fb2da71885-kube-api-access-lhlff\") on node \"crc\" DevicePath \"\"" Mar 12 18:35:38 crc kubenswrapper[4926]: I0312 18:35:38.257583 4926 scope.go:117] "RemoveContainer" containerID="c1196ac145f33bde3cf7ef54fd8462b66ce61b83f03e51afe39dafbe2872f728" Mar 12 18:35:38 crc kubenswrapper[4926]: I0312 18:35:38.257704 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nz8z6/crc-debug-6jp9x" Mar 12 18:35:38 crc kubenswrapper[4926]: I0312 18:35:38.501790 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ffd21c-99e8-47f0-945a-70fb2da71885" path="/var/lib/kubelet/pods/23ffd21c-99e8-47f0-945a-70fb2da71885/volumes" Mar 12 18:35:46 crc kubenswrapper[4926]: I0312 18:35:46.490644 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:35:46 crc kubenswrapper[4926]: E0312 18:35:46.491512 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:35:59 crc kubenswrapper[4926]: I0312 18:35:59.490481 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:35:59 crc kubenswrapper[4926]: E0312 18:35:59.491204 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.149017 4926 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29555676-5d5p5"] Mar 12 18:36:00 crc kubenswrapper[4926]: E0312 18:36:00.149490 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ffd21c-99e8-47f0-945a-70fb2da71885" containerName="container-00" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.149519 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ffd21c-99e8-47f0-945a-70fb2da71885" containerName="container-00" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.149732 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ffd21c-99e8-47f0-945a-70fb2da71885" containerName="container-00" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.151530 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.154512 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.155083 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.156753 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.187597 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555676-5d5p5"] Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.277975 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tqr\" (UniqueName: \"kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr\") pod \"auto-csr-approver-29555676-5d5p5\" (UID: \"f1b045ea-1fff-43b3-9a66-48dd361e9f33\") " pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.380825 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tqr\" (UniqueName: \"kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr\") pod \"auto-csr-approver-29555676-5d5p5\" (UID: \"f1b045ea-1fff-43b3-9a66-48dd361e9f33\") " pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.404043 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tqr\" (UniqueName: \"kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr\") pod \"auto-csr-approver-29555676-5d5p5\" (UID: \"f1b045ea-1fff-43b3-9a66-48dd361e9f33\") " pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.473709 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:00 crc kubenswrapper[4926]: I0312 18:36:00.931014 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555676-5d5p5"] Mar 12 18:36:01 crc kubenswrapper[4926]: I0312 18:36:01.469480 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" event={"ID":"f1b045ea-1fff-43b3-9a66-48dd361e9f33","Type":"ContainerStarted","Data":"89b811e4e20a47fa80d513af23713e88bc0b9d51b9526b50771abdfdfd97db80"} Mar 12 18:36:02 crc kubenswrapper[4926]: I0312 18:36:02.480781 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" event={"ID":"f1b045ea-1fff-43b3-9a66-48dd361e9f33","Type":"ContainerStarted","Data":"cfe5fa51265434fa1414bed949d3e915d15d11d193d0cf363993fcfab40b91d4"} Mar 12 18:36:02 crc kubenswrapper[4926]: I0312 18:36:02.497510 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" podStartSLOduration=1.337521948 podStartE2EDuration="2.497494259s" podCreationTimestamp="2026-03-12 18:36:00 +0000 UTC" firstStartedPulling="2026-03-12 18:36:00.936511179 +0000 UTC m=+2001.305137532" lastFinishedPulling="2026-03-12 18:36:02.09648351 +0000 UTC m=+2002.465109843" observedRunningTime="2026-03-12 18:36:02.496195318 +0000 UTC m=+2002.864821651" watchObservedRunningTime="2026-03-12 18:36:02.497494259 +0000 UTC m=+2002.866120592" Mar 12 18:36:03 crc kubenswrapper[4926]: I0312 18:36:03.489786 4926 generic.go:334] "Generic (PLEG): container finished" podID="f1b045ea-1fff-43b3-9a66-48dd361e9f33" containerID="cfe5fa51265434fa1414bed949d3e915d15d11d193d0cf363993fcfab40b91d4" exitCode=0 Mar 12 18:36:03 crc kubenswrapper[4926]: I0312 18:36:03.489909 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" event={"ID":"f1b045ea-1fff-43b3-9a66-48dd361e9f33","Type":"ContainerDied","Data":"cfe5fa51265434fa1414bed949d3e915d15d11d193d0cf363993fcfab40b91d4"} Mar 12 18:36:04 crc kubenswrapper[4926]: I0312 18:36:04.904655 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:04 crc kubenswrapper[4926]: I0312 18:36:04.995987 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tqr\" (UniqueName: \"kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr\") pod \"f1b045ea-1fff-43b3-9a66-48dd361e9f33\" (UID: \"f1b045ea-1fff-43b3-9a66-48dd361e9f33\") " Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.015686 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr" (OuterVolumeSpecName: "kube-api-access-v6tqr") pod "f1b045ea-1fff-43b3-9a66-48dd361e9f33" (UID: "f1b045ea-1fff-43b3-9a66-48dd361e9f33"). InnerVolumeSpecName "kube-api-access-v6tqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.098561 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6tqr\" (UniqueName: \"kubernetes.io/projected/f1b045ea-1fff-43b3-9a66-48dd361e9f33-kube-api-access-v6tqr\") on node \"crc\" DevicePath \"\"" Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.508107 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" event={"ID":"f1b045ea-1fff-43b3-9a66-48dd361e9f33","Type":"ContainerDied","Data":"89b811e4e20a47fa80d513af23713e88bc0b9d51b9526b50771abdfdfd97db80"} Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.508163 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b811e4e20a47fa80d513af23713e88bc0b9d51b9526b50771abdfdfd97db80" Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.508250 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555676-5d5p5" Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.575988 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555670-v8lqv"] Mar 12 18:36:05 crc kubenswrapper[4926]: I0312 18:36:05.587781 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555670-v8lqv"] Mar 12 18:36:06 crc kubenswrapper[4926]: I0312 18:36:06.504943 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa768ee-1644-40fa-8f52-790b28d9bb74" path="/var/lib/kubelet/pods/daa768ee-1644-40fa-8f52-790b28d9bb74/volumes" Mar 12 18:36:11 crc kubenswrapper[4926]: I0312 18:36:11.490048 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:36:11 crc kubenswrapper[4926]: E0312 18:36:11.490899 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:36:15 crc kubenswrapper[4926]: I0312 18:36:15.786452 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fd6447dc6-7dg85_32458d94-2727-4718-a842-f20e68b6a0dd/barbican-api/0.log" Mar 12 18:36:15 crc kubenswrapper[4926]: I0312 18:36:15.955992 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6fd6447dc6-7dg85_32458d94-2727-4718-a842-f20e68b6a0dd/barbican-api-log/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.000377 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74668b896d-mctls_4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92/barbican-keystone-listener/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.024517 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74668b896d-mctls_4cce8ba3-9eb9-4ed4-a3bf-a6787fdaae92/barbican-keystone-listener-log/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.189904 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fddddf9f9-kbftb_08cef0ec-16bc-4b64-95f6-e0d8f22fa00e/barbican-worker/0.log" Mar 12 18:36:16 
crc kubenswrapper[4926]: I0312 18:36:16.205296 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-fddddf9f9-kbftb_08cef0ec-16bc-4b64-95f6-e0d8f22fa00e/barbican-worker-log/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.383996 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f983b88e-aba3-4d49-bbd4-4db5eef5266c/ceilometer-central-agent/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.446922 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-88k6z_75a3208b-42f5-412e-a503-ac328f7d9967/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.474103 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f983b88e-aba3-4d49-bbd4-4db5eef5266c/ceilometer-notification-agent/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.562788 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f983b88e-aba3-4d49-bbd4-4db5eef5266c/proxy-httpd/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.626336 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f983b88e-aba3-4d49-bbd4-4db5eef5266c/sg-core/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.656562 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-fm77w_1891d243-9edd-48aa-88ff-f943dc337e8d/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.837083 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_edd14509-9e82-40a2-aea4-c6ad4250be05/cinder-api/0.log" Mar 12 18:36:16 crc kubenswrapper[4926]: I0312 18:36:16.846394 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_edd14509-9e82-40a2-aea4-c6ad4250be05/cinder-api-log/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.021706 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b586d98-12e7-4814-82c6-6724e1b35a77/cinder-scheduler/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.053693 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b586d98-12e7-4814-82c6-6724e1b35a77/probe/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.177900 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-h2dvq_eab3d0c2-5edc-4657-928f-52a87de2293a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.240338 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-2d679_2fd86e0a-1dc5-4f07-96fb-e9d15b32cca3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.434531 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc8b56cf-6cbwx_6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69/init/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.598537 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc8b56cf-6cbwx_6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69/dnsmasq-dns/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.624531 4926 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fc8b56cf-6cbwx_6f42a0ec-9b43-4a24-b5b2-89cd0f3abe69/init/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.643193 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0749ce08-7fa5-48fe-9248-ac3a6699ef57/glance-httpd/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.822836 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e75a0d1-4272-43f3-b1b8-3cfe57e0141d/glance-httpd/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.836599 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0749ce08-7fa5-48fe-9248-ac3a6699ef57/glance-log/0.log" Mar 12 18:36:17 crc kubenswrapper[4926]: I0312 18:36:17.868040 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1e75a0d1-4272-43f3-b1b8-3cfe57e0141d/glance-log/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.126540 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c6848d8cd-cq57n_a1ae8f23-3518-430a-bbcf-e7be0cb8282e/horizon/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.146115 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c6848d8cd-cq57n_a1ae8f23-3518-430a-bbcf-e7be0cb8282e/horizon-log/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.289739 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-njwrr_7f4f99b4-e1a5-4b29-b5e4-c4fcd240ca46/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.426238 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b667c464b-fk8sc_054d27f0-5c9b-4e59-98b3-e05609c3b257/keystone-api/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.496329 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_30b7fb7c-fcab-4551-8284-c0dab53beb21/kube-state-metrics/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.739379 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7fc6496dbc-7qwrw_bf1502ef-50a4-45e0-b193-a6e25abccb32/neutron-api/0.log" Mar 12 18:36:18 crc kubenswrapper[4926]: I0312 18:36:18.852128 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7fc6496dbc-7qwrw_bf1502ef-50a4-45e0-b193-a6e25abccb32/neutron-httpd/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.109915 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_65b09111-c033-45e3-97d3-cd755e1a79ab/nova-api-api/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.145641 4926 scope.go:117] "RemoveContainer" containerID="ddcfcec936ed5e6887603a245e014b4f9336991c54bce882c0b7657f1d5a318b" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.184364 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_65b09111-c033-45e3-97d3-cd755e1a79ab/nova-api-log/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.199128 4926 scope.go:117] "RemoveContainer" containerID="799ca781836933eed5f381ea7d6b69d2905b6867e2a978eb9f38bd2b549ed6cb" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.450592 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_397b3426-1e6c-468a-8d6d-562e70944d9d/nova-cell0-conductor-conductor/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.504397 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9696ed5e-263b-457b-a771-9f7a7e27dbed/nova-cell1-conductor-conductor/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.724166 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5990afe5-179c-401d-99eb-58b27e2bfc9e/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 18:36:19 crc kubenswrapper[4926]: I0312 18:36:19.788553 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8065d406-f127-4c57-b603-e7e6afeb3731/nova-metadata-log/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.045133 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6d9c2983-b118-4677-ba18-20531d4223ad/nova-scheduler-scheduler/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.070149 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8065d406-f127-4c57-b603-e7e6afeb3731/nova-metadata-metadata/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.187512 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_94d38c16-a6c9-44ed-a49e-398dc34b92ce/mysql-bootstrap/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.334094 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_94d38c16-a6c9-44ed-a49e-398dc34b92ce/mysql-bootstrap/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.416898 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1ee00086-3c8a-4f3a-a5d5-9590715a8b95/mysql-bootstrap/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.427172 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_94d38c16-a6c9-44ed-a49e-398dc34b92ce/galera/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.562161 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1ee00086-3c8a-4f3a-a5d5-9590715a8b95/mysql-bootstrap/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.644280 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1ee00086-3c8a-4f3a-a5d5-9590715a8b95/galera/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.658263 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8ac368f3-42fb-4f4a-ba68-1686386b017e/openstackclient/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.838533 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mszqk_b4a45633-8ac7-497f-a8d7-2b7a3dad35bc/openstack-network-exporter/0.log" Mar 12 18:36:20 crc kubenswrapper[4926]: I0312 18:36:20.942600 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6znbv_30dfa384-92a5-49cf-9793-60478855264f/ovsdb-server-init/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.046120 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6znbv_30dfa384-92a5-49cf-9793-60478855264f/ovs-vswitchd/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.094391 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-6znbv_30dfa384-92a5-49cf-9793-60478855264f/ovsdb-server/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.116048 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-6znbv_30dfa384-92a5-49cf-9793-60478855264f/ovsdb-server-init/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.288372 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sfwpr_8f19fbb7-ea3b-437a-a634-498e6a593ef6/ovn-controller/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.360284 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3ffbc2f0-cbec-43d4-9907-a95af037ae1b/openstack-network-exporter/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.382462 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3ffbc2f0-cbec-43d4-9907-a95af037ae1b/ovn-northd/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.530683 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3288572-a9dc-4f96-8535-b05f6f22855b/ovsdbserver-nb/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.531628 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c3288572-a9dc-4f96-8535-b05f6f22855b/openstack-network-exporter/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.719993 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8926ccac-b553-4e37-bbcb-96e3b00c1cab/ovsdbserver-sb/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.806151 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8926ccac-b553-4e37-bbcb-96e3b00c1cab/openstack-network-exporter/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.861390 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6fdd856968-nmxxn_0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81/placement-api/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.941193 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6fdd856968-nmxxn_0c666ce1-e9f9-480f-a2eb-91c2cb9e7d81/placement-log/0.log" Mar 12 18:36:21 crc kubenswrapper[4926]: I0312 18:36:21.984949 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9b82f03-7ac1-4805-858b-708760b4e476/setup-container/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.300900 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9b82f03-7ac1-4805-858b-708760b4e476/rabbitmq/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.326684 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b9b82f03-7ac1-4805-858b-708760b4e476/setup-container/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.370284 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9aba0434-585e-4355-8019-1612400b2350/setup-container/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.530165 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9aba0434-585e-4355-8019-1612400b2350/rabbitmq/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.578239 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_9aba0434-585e-4355-8019-1612400b2350/setup-container/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.595826 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-5c2tc_63c9d413-012b-47aa-a519-113685eb478c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:22 crc kubenswrapper[4926]: I0312 18:36:22.988964 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-g45rs_58e2a40e-84ba-43fe-8087-ec2e917a2b34/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.035273 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zxwll_f12fab7b-7f88-441e-b230-551dc9ffa270/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.214389 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-hckzq_d48eb883-81b1-4ad6-bbbe-f9c5a9779fde/ssh-known-hosts-edpm-deployment/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.298949 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bbdc94bc7-pmtmm_09eb7b3b-c5af-4625-8f1a-83766550711c/proxy-httpd/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.474632 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bbdc94bc7-pmtmm_09eb7b3b-c5af-4625-8f1a-83766550711c/proxy-server/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.489615 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:36:23 crc kubenswrapper[4926]: E0312 18:36:23.489884 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.538235 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-bn6gf_9e8d5f15-0f8f-4e98-a9fc-cf0925c0990f/swift-ring-rebalance/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.664400 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/account-auditor/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.702831 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/account-reaper/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.752167 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/account-server/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.806450 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/account-replicator/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.868650 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/container-auditor/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.933337 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/container-replicator/0.log" Mar 12 18:36:23 crc kubenswrapper[4926]: I0312 18:36:23.970942 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/container-updater/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.006807 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/container-server/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.084341 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/object-auditor/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.142879 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/object-expirer/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.181750 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/object-replicator/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.260307 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/object-server/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.331046 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/object-updater/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.366882 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/rsync/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.419393 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_57853681-32de-4475-9c7d-3f9708fe7d91/swift-recon-cron/0.log" Mar 12 18:36:24 crc kubenswrapper[4926]: I0312 18:36:24.570899 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-55zdj_96a114de-73e3-4088-902e-c3d1fcaaa3ad/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 18:36:28 crc kubenswrapper[4926]: I0312 18:36:28.525408 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_69083379-a7d7-4876-9955-497420eab579/memcached/0.log" Mar 12 18:36:35 crc kubenswrapper[4926]: I0312 18:36:35.489680 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:36:35 crc kubenswrapper[4926]: E0312 18:36:35.490446 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.246379 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/util/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.343898 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/util/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.425361 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/pull/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.448111 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/pull/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.600562 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/util/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.627583 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/pull/0.log" Mar 12 18:36:47 crc kubenswrapper[4926]: I0312 18:36:47.647333 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0a9cf9d9b218433ba8d40c603f7c72428a887310de9e6eb6a71674eecedmpzt_1a47463c-c539-4a5c-a3fd-c09feef2de67/extract/0.log" Mar 12 18:36:48 crc kubenswrapper[4926]: I0312 18:36:48.038501 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-clngf_4800992f-cfad-4a1c-94e5-79427f88c002/manager/0.log" Mar 12 18:36:48 crc kubenswrapper[4926]: I0312 18:36:48.372542 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-m6sxh_b776be98-1352-43c6-8ee8-e31076b7d12b/manager/0.log" Mar 12 18:36:48 crc kubenswrapper[4926]: I0312 18:36:48.633975 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-525z5_fea71415-42ac-4e77-ba9c-25170ccece27/manager/0.log" Mar 12 18:36:48 crc kubenswrapper[4926]: I0312 18:36:48.830924 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-p76mx_065fe73a-651c-4cd3-b8d7-135617c51bbd/manager/0.log" Mar 12 18:36:49 crc kubenswrapper[4926]: I0312 18:36:49.176886 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-zlsrd_f112cb87-7454-41fa-a1e1-381d79f86247/manager/0.log" Mar 12 18:36:49 crc kubenswrapper[4926]: I0312 18:36:49.356030 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-7q8xr_a41da562-a119-4785-95d0-eaf0970a99f4/manager/0.log" Mar 12 18:36:49 crc kubenswrapper[4926]: I0312 18:36:49.508534 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-kzl7p_fd35525c-7b73-49d1-a36c-c49d3bf933eb/manager/0.log" Mar 12 18:36:49 crc kubenswrapper[4926]: I0312 18:36:49.700245 4926 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-5f6c4_c62baaa0-d72b-4240-9dba-858bdf61d1b3/manager/0.log" Mar 12 18:36:49 crc kubenswrapper[4926]: I0312 18:36:49.770879 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-f96w7_e38cf931-bbd6-4b47-bdf6-8a514d17d3d7/manager/0.log" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.010518 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-c58lh_ccd465e8-6811-4865-9602-3dea8144cc01/manager/0.log" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.331419 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-njrjb_6374b396-00ef-4aca-ac07-fd46982f23f1/manager/0.log" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.466140 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-dzj5w_2e44c177-b87d-4ff6-80ca-672477fe9e94/manager/0.log" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.497307 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:36:50 crc kubenswrapper[4926]: E0312 18:36:50.497614 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.596607 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-cv97b_464725b8-2734-43a4-a232-5db9bafed311/manager/0.log" Mar 12 18:36:50 crc kubenswrapper[4926]: I0312 18:36:50.798519 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7cxtpk_5d4dea90-1696-4195-a0a0-71a3c9f3e328/manager/0.log" Mar 12 18:36:51 crc kubenswrapper[4926]: I0312 18:36:51.498068 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-66f4595798-c276m_fe4ee666-a3c1-44c5-a07b-5ca8438e0482/operator/0.log" Mar 12 18:36:51 crc kubenswrapper[4926]: I0312 18:36:51.674361 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hwc2c_66ae05ba-2fc2-4915-9899-083a49295427/registry-server/0.log" Mar 12 18:36:51 crc kubenswrapper[4926]: I0312 18:36:51.853836 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-xr747_14346445-95be-488f-858c-44bf5b45c656/manager/0.log" Mar 12 18:36:51 crc kubenswrapper[4926]: I0312 18:36:51.981187 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-tpnlt_e3baa344-9dd8-48ed-8b6a-60ff9fbc181a/manager/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.147397 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wqzk9_fc6165e6-3d8f-4ddd-b6ae-a1307f2c6b3b/operator/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.363125 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-r2cxk_309d4a2a-f738-4d2d-a28e-f361f762f997/manager/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.579752 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-z87mn_45ddf7b5-1b73-473c-9da8-c35d9a4e0ddd/manager/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.626316 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-25jrt_682f3a0f-7437-455c-99e1-8b7cdb03328a/manager/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.839737 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-cn7fq_80f26a3f-ab1e-49b1-8843-6674d948a5cd/manager/0.log" Mar 12 18:36:52 crc kubenswrapper[4926]: I0312 18:36:52.908650 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55976db558-2kgv6_495cec8d-a262-4d55-9ee5-6eebb10b6765/manager/0.log" Mar 12 18:36:54 crc kubenswrapper[4926]: I0312 18:36:54.284789 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-ssmx7_e7b3fba0-ddaa-4cef-9df6-0683a92475cf/manager/0.log" Mar 12 18:37:01 crc kubenswrapper[4926]: I0312 18:37:01.258600 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-bbdc94bc7-pmtmm" podUID="09eb7b3b-c5af-4625-8f1a-83766550711c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 12 18:37:02 crc kubenswrapper[4926]: I0312 18:37:02.490423 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:37:03 crc kubenswrapper[4926]: I0312 18:37:03.029422 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c"} Mar 12 18:37:14 crc kubenswrapper[4926]: I0312 18:37:14.068688 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-sgl9z_5d073c88-4608-4594-9feb-f1093455368d/control-plane-machine-set-operator/0.log" Mar 12 18:37:14 crc kubenswrapper[4926]: I0312 18:37:14.276473 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bfn44_49e5e304-df7c-434b-8b17-f520e9bb7d52/kube-rbac-proxy/0.log" Mar 12 18:37:14 crc kubenswrapper[4926]: I0312 18:37:14.302575 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bfn44_49e5e304-df7c-434b-8b17-f520e9bb7d52/machine-api-operator/0.log" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.516739 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:37:24 crc kubenswrapper[4926]: E0312 18:37:24.521040 4926 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1b045ea-1fff-43b3-9a66-48dd361e9f33" containerName="oc" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.521205 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1b045ea-1fff-43b3-9a66-48dd361e9f33" containerName="oc" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.521525 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1b045ea-1fff-43b3-9a66-48dd361e9f33" containerName="oc" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.523246 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.535974 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.670244 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbv8\" (UniqueName: \"kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.670295 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.670399 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.772073 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.772201 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbv8\" (UniqueName: \"kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.772232 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.772634 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " 
pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.772793 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.810140 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbv8\" (UniqueName: \"kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8\") pod \"redhat-operators-ngnmz\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:24 crc kubenswrapper[4926]: I0312 18:37:24.845873 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:25 crc kubenswrapper[4926]: I0312 18:37:25.316955 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:37:25 crc kubenswrapper[4926]: W0312 18:37:25.323377 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod305b4cae_bc27_4c5d_8142_042446a753af.slice/crio-27c46b747fc3b972108b73d6610b46b16d2ab44f98be0213b6dafd0b1655a885 WatchSource:0}: Error finding container 27c46b747fc3b972108b73d6610b46b16d2ab44f98be0213b6dafd0b1655a885: Status 404 returned error can't find the container with id 27c46b747fc3b972108b73d6610b46b16d2ab44f98be0213b6dafd0b1655a885 Mar 12 18:37:26 crc kubenswrapper[4926]: I0312 18:37:26.281900 4926 generic.go:334] "Generic (PLEG): container finished" podID="305b4cae-bc27-4c5d-8142-042446a753af" containerID="3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91" exitCode=0 Mar 12 18:37:26 crc kubenswrapper[4926]: I0312 18:37:26.282008 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerDied","Data":"3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91"} Mar 12 18:37:26 crc kubenswrapper[4926]: I0312 18:37:26.282316 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerStarted","Data":"27c46b747fc3b972108b73d6610b46b16d2ab44f98be0213b6dafd0b1655a885"} Mar 12 18:37:27 crc kubenswrapper[4926]: I0312 18:37:27.291477 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerStarted","Data":"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3"} Mar 12 18:37:27 crc kubenswrapper[4926]: I0312 18:37:27.693693 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7pf7h_04933adf-efe6-4d54-8575-cc5c4069ea9a/cert-manager-controller/0.log" Mar 12 18:37:27 crc kubenswrapper[4926]: I0312 18:37:27.837972 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mbhfc_eaa86db1-fe85-4b00-b8e0-c61cb013f52d/cert-manager-cainjector/0.log" Mar 12 18:37:27 crc kubenswrapper[4926]: I0312 18:37:27.888796 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-p8p7c_ef14eb59-d30a-437c-80d1-70513a544b2d/cert-manager-webhook/0.log" Mar 12 18:37:29 crc kubenswrapper[4926]: I0312 18:37:29.313123 4926 generic.go:334] "Generic (PLEG): container finished" podID="305b4cae-bc27-4c5d-8142-042446a753af" containerID="45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3" exitCode=0 Mar 12 18:37:29 crc kubenswrapper[4926]: I0312 18:37:29.313186 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerDied","Data":"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3"} Mar 12 18:37:31 crc kubenswrapper[4926]: I0312 18:37:31.349316 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerStarted","Data":"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f"} Mar 12 18:37:31 crc kubenswrapper[4926]: I0312 18:37:31.368742 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngnmz" podStartSLOduration=3.175810541 podStartE2EDuration="7.368727456s" podCreationTimestamp="2026-03-12 18:37:24 +0000 UTC" firstStartedPulling="2026-03-12 18:37:26.283793936 +0000 UTC m=+2086.652420269" lastFinishedPulling="2026-03-12 18:37:30.476710851 +0000 UTC m=+2090.845337184" observedRunningTime="2026-03-12 18:37:31.368277451 +0000 UTC m=+2091.736903784" watchObservedRunningTime="2026-03-12 18:37:31.368727456 +0000 UTC m=+2091.737353789" Mar 12 18:37:34 crc kubenswrapper[4926]: I0312 18:37:34.846488 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:34 crc kubenswrapper[4926]: I0312 18:37:34.848133 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:37:35 crc kubenswrapper[4926]: I0312 18:37:35.892223 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngnmz" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" probeResult="failure" output=< Mar 12 18:37:35 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:37:35 crc kubenswrapper[4926]: > Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.235951 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-67hvb_9e731d72-a0a8-46a4-af0b-5dce65f29dd1/nmstate-console-plugin/0.log" Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.389465 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-z52rq_9db224ca-f640-4756-9c80-afe7ff63dcbe/nmstate-handler/0.log" Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.411272 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-rn7dj_cf97acff-2650-4e84-ab30-d10f9bd70ef4/kube-rbac-proxy/0.log" Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.582177 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-rn7dj_cf97acff-2650-4e84-ab30-d10f9bd70ef4/nmstate-metrics/0.log" Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.597584 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-wnkdv_7247f6a4-e62d-44e3-b91c-1117fca5c960/nmstate-operator/0.log" Mar 12 18:37:41 crc kubenswrapper[4926]: I0312 18:37:41.765538 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-h5ll8_bdc3504f-bf1b-4b71-aaf2-45e24e41a84e/nmstate-webhook/0.log" Mar 12 18:37:45 crc kubenswrapper[4926]: I0312 18:37:45.892489 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngnmz" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" probeResult="failure" output=< Mar 12 18:37:45 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:37:45 crc kubenswrapper[4926]: > Mar 12 18:37:55 crc kubenswrapper[4926]: I0312 18:37:55.909664 4926 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngnmz" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" probeResult="failure" output=< Mar 12 18:37:55 crc kubenswrapper[4926]: timeout: failed to connect service ":50051" within 1s Mar 12 18:37:55 crc kubenswrapper[4926]: > Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.140567 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555678-f2ldc"] Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.142018 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.143704 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.144075 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.144738 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.151961 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555678-f2ldc"] Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.175268 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgcwh\" (UniqueName: \"kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh\") pod \"auto-csr-approver-29555678-f2ldc\" (UID: \"6e31677c-4037-4cbc-9d34-5487431793e9\") " pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.277625 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgcwh\" (UniqueName: \"kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh\") pod \"auto-csr-approver-29555678-f2ldc\" (UID: \"6e31677c-4037-4cbc-9d34-5487431793e9\") " pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.302378 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgcwh\" (UniqueName: \"kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh\") pod \"auto-csr-approver-29555678-f2ldc\" (UID: \"6e31677c-4037-4cbc-9d34-5487431793e9\") " pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:00 crc 
kubenswrapper[4926]: I0312 18:38:00.462450 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:00 crc kubenswrapper[4926]: I0312 18:38:00.999274 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555678-f2ldc"] Mar 12 18:38:01 crc kubenswrapper[4926]: I0312 18:38:01.007479 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:38:01 crc kubenswrapper[4926]: I0312 18:38:01.633293 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" event={"ID":"6e31677c-4037-4cbc-9d34-5487431793e9","Type":"ContainerStarted","Data":"45c44d1933bdfb1f99d54cecba14b4c68a52a137566e9c1b97d4dacb2c549a18"} Mar 12 18:38:02 crc kubenswrapper[4926]: I0312 18:38:02.648989 4926 generic.go:334] "Generic (PLEG): container finished" podID="6e31677c-4037-4cbc-9d34-5487431793e9" containerID="d822df10a89aa630f25e85c9dfeda5e7eb5b0c4e3ca5846540130f6ca7031958" exitCode=0 Mar 12 18:38:02 crc kubenswrapper[4926]: I0312 18:38:02.649197 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" event={"ID":"6e31677c-4037-4cbc-9d34-5487431793e9","Type":"ContainerDied","Data":"d822df10a89aa630f25e85c9dfeda5e7eb5b0c4e3ca5846540130f6ca7031958"} Mar 12 18:38:03 crc kubenswrapper[4926]: I0312 18:38:03.988464 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.047392 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgcwh\" (UniqueName: \"kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh\") pod \"6e31677c-4037-4cbc-9d34-5487431793e9\" (UID: \"6e31677c-4037-4cbc-9d34-5487431793e9\") " Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.053055 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh" (OuterVolumeSpecName: "kube-api-access-rgcwh") pod "6e31677c-4037-4cbc-9d34-5487431793e9" (UID: "6e31677c-4037-4cbc-9d34-5487431793e9"). InnerVolumeSpecName "kube-api-access-rgcwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.149250 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgcwh\" (UniqueName: \"kubernetes.io/projected/6e31677c-4037-4cbc-9d34-5487431793e9-kube-api-access-rgcwh\") on node \"crc\" DevicePath \"\"" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.665167 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" event={"ID":"6e31677c-4037-4cbc-9d34-5487431793e9","Type":"ContainerDied","Data":"45c44d1933bdfb1f99d54cecba14b4c68a52a137566e9c1b97d4dacb2c549a18"} Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.665819 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c44d1933bdfb1f99d54cecba14b4c68a52a137566e9c1b97d4dacb2c549a18" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.665235 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555678-f2ldc" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.909187 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:38:04 crc kubenswrapper[4926]: I0312 18:38:04.956780 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:38:05 crc kubenswrapper[4926]: I0312 18:38:05.051552 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555672-dq68h"] Mar 12 18:38:05 crc kubenswrapper[4926]: I0312 18:38:05.058516 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555672-dq68h"] Mar 12 18:38:05 crc kubenswrapper[4926]: I0312 18:38:05.141303 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:38:06 crc kubenswrapper[4926]: I0312 18:38:06.501759 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609b8dbc-517d-4483-b02a-d7445cd2aa2f" path="/var/lib/kubelet/pods/609b8dbc-517d-4483-b02a-d7445cd2aa2f/volumes" Mar 12 18:38:06 crc kubenswrapper[4926]: I0312 18:38:06.680982 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngnmz" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" containerID="cri-o://74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f" gracePeriod=2 Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.119139 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.220204 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content\") pod \"305b4cae-bc27-4c5d-8142-042446a753af\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.220330 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcbv8\" (UniqueName: \"kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8\") pod \"305b4cae-bc27-4c5d-8142-042446a753af\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.220471 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities\") pod \"305b4cae-bc27-4c5d-8142-042446a753af\" (UID: \"305b4cae-bc27-4c5d-8142-042446a753af\") " Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.221146 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities" (OuterVolumeSpecName: "utilities") pod "305b4cae-bc27-4c5d-8142-042446a753af" (UID: "305b4cae-bc27-4c5d-8142-042446a753af"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.234805 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8" (OuterVolumeSpecName: "kube-api-access-mcbv8") pod "305b4cae-bc27-4c5d-8142-042446a753af" (UID: "305b4cae-bc27-4c5d-8142-042446a753af"). InnerVolumeSpecName "kube-api-access-mcbv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.322992 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcbv8\" (UniqueName: \"kubernetes.io/projected/305b4cae-bc27-4c5d-8142-042446a753af-kube-api-access-mcbv8\") on node \"crc\" DevicePath \"\"" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.323042 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.337995 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "305b4cae-bc27-4c5d-8142-042446a753af" (UID: "305b4cae-bc27-4c5d-8142-042446a753af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.424474 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305b4cae-bc27-4c5d-8142-042446a753af-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.694660 4926 generic.go:334] "Generic (PLEG): container finished" podID="305b4cae-bc27-4c5d-8142-042446a753af" containerID="74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f" exitCode=0 Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.694762 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngnmz" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.694789 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerDied","Data":"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f"} Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.695294 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngnmz" event={"ID":"305b4cae-bc27-4c5d-8142-042446a753af","Type":"ContainerDied","Data":"27c46b747fc3b972108b73d6610b46b16d2ab44f98be0213b6dafd0b1655a885"} Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.695328 4926 scope.go:117] "RemoveContainer" containerID="74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.738753 4926 scope.go:117] "RemoveContainer" containerID="45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.745634 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.753941 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngnmz"] Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.766414 4926 scope.go:117] "RemoveContainer" containerID="3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.810013 4926 scope.go:117] "RemoveContainer" containerID="74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f" Mar 12 18:38:07 crc kubenswrapper[4926]: E0312 18:38:07.810571 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f\": container with ID starting with 74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f not found: ID does not exist" containerID="74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.810605 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f"} err="failed to get container status \"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f\": rpc error: code = NotFound desc = could not find container \"74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f\": container with ID starting with 74ce600f7c985ed38e0c138bcf27081c7bc8d0ed8d001ebf2fdb7df3b4033d0f not found: ID does not exist" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.810631 4926 scope.go:117] "RemoveContainer" containerID="45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3" Mar 12 18:38:07 crc kubenswrapper[4926]: E0312 18:38:07.810960 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3\": container with ID starting with 45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3 not found: ID does not exist" containerID="45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.810990 4926 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3"} err="failed to get container status \"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3\": rpc error: code = NotFound desc = could not find container \"45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3\": container with ID starting with 45de0ce8f8fe963be3a32d9a7c53fa9cb9f050130db08052d5f428486f761bb3 not found: ID does not exist" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.811010 4926 scope.go:117] "RemoveContainer" containerID="3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91" Mar 12 18:38:07 crc kubenswrapper[4926]: E0312 18:38:07.811466 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91\": container with ID starting with 3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91 not found: ID does not exist" containerID="3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91" Mar 12 18:38:07 crc kubenswrapper[4926]: I0312 18:38:07.811493 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91"} err="failed to get container status \"3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91\": rpc error: code = NotFound desc = could not find container \"3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91\": container with ID starting with 3db3234f981f4766633cb03a8859373ca4c860ed57313cf0bf6a95d3e1aa5e91 not found: ID does not exist" Mar 12 18:38:08 crc kubenswrapper[4926]: I0312 18:38:08.502308 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305b4cae-bc27-4c5d-8142-042446a753af" path="/var/lib/kubelet/pods/305b4cae-bc27-4c5d-8142-042446a753af/volumes" Mar 12 18:38:09 crc kubenswrapper[4926]: I0312 18:38:09.849638 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-58dtw_16ea4f33-38fb-42c7-9c85-67c443f0b3a4/kube-rbac-proxy/0.log" Mar 12 18:38:09 crc kubenswrapper[4926]: I0312 18:38:09.887091 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-58dtw_16ea4f33-38fb-42c7-9c85-67c443f0b3a4/controller/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.102123 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-frr-files/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.295928 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-reloader/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.298334 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-reloader/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.321899 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-metrics/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.390859 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-frr-files/0.log" Mar 12 18:38:10 crc 
kubenswrapper[4926]: I0312 18:38:10.495786 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-metrics/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.500236 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-frr-files/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.544116 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-reloader/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.600779 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-metrics/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.710183 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-frr-files/0.log" Mar 12 18:38:10 crc kubenswrapper[4926]: I0312 18:38:10.777833 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-reloader/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.003876 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/controller/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.009495 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/cp-metrics/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.205040 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/frr-metrics/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.205353 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/kube-rbac-proxy/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.232467 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/kube-rbac-proxy-frr/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.451989 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/reloader/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.481116 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-j2n6d_036c2795-2942-4cc8-9a91-6cc48cbe7521/frr-k8s-webhook-server/0.log" Mar 12 18:38:11 crc kubenswrapper[4926]: I0312 18:38:11.801132 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7cf578c5b8-z4gn8_8a847a81-61ae-42e3-9866-c25b68fd77cb/manager/0.log" Mar 12 18:38:12 crc kubenswrapper[4926]: I0312 18:38:12.021683 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79dbc878dc-w7f58_4d6f4326-2022-436b-9523-383aae3fd5cd/webhook-server/0.log" Mar 12 18:38:12 crc kubenswrapper[4926]: I0312 18:38:12.115945 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6bt9q_d10f0ca7-6fc4-4e6a-815c-ad5a1db16350/kube-rbac-proxy/0.log" Mar 12 18:38:12 crc kubenswrapper[4926]: I0312 18:38:12.338086 
4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9p72p_53da3fff-e3f4-4b9d-a887-f5a28f986107/frr/0.log" Mar 12 18:38:12 crc kubenswrapper[4926]: I0312 18:38:12.555346 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6bt9q_d10f0ca7-6fc4-4e6a-815c-ad5a1db16350/speaker/0.log" Mar 12 18:38:19 crc kubenswrapper[4926]: I0312 18:38:19.372007 4926 scope.go:117] "RemoveContainer" containerID="95bf3d49a01a19202e8eb63fbef45841e7cc791e3f3ccb3272c728981fc71d7d" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.409237 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/util/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.635898 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/util/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.646991 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/pull/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.647110 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/pull/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.819914 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/pull/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.856317 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/extract/0.log" Mar 12 18:38:25 crc kubenswrapper[4926]: I0312 18:38:25.865875 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874j5hnq_062d1c31-cb0c-4470-bf74-0fb541319609/util/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.001735 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/util/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.149075 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/util/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.157145 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/pull/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.199143 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/pull/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.397020 4926 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/extract/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.404248 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/util/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.413074 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1pp5dg_a46be1f3-50fc-45b5-a480-98d9763db69d/pull/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.591076 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-utilities/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.741106 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-utilities/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.755966 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-content/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.804292 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-content/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.953100 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-utilities/0.log" Mar 12 18:38:26 crc kubenswrapper[4926]: I0312 18:38:26.997124 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/extract-content/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.146068 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-utilities/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.365068 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-t8lwn_0ef972e3-81c7-4b62-aa07-4939aef86a2d/registry-server/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.422679 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-content/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.454327 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-utilities/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.476668 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-content/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.593529 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-utilities/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: 
I0312 18:38:27.639586 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/extract-content/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.788196 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5k9gj_0817ed2e-acdd-41b8-b210-84281525839f/registry-server/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.856714 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-d9gx5_daeebaaf-6a69-436e-b341-36fae756599e/marketplace-operator/0.log" Mar 12 18:38:27 crc kubenswrapper[4926]: I0312 18:38:27.954122 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.106361 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.135741 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-content/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.147061 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-content/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.351625 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.361498 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/extract-content/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.417350 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-265z5_0f746e64-bce3-4f58-b789-0f5573e28847/registry-server/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.560737 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.689991 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.710518 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-content/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.737130 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-content/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.871735 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-utilities/0.log" Mar 12 18:38:28 crc kubenswrapper[4926]: I0312 18:38:28.908057 
4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/extract-content/0.log" Mar 12 18:38:29 crc kubenswrapper[4926]: I0312 18:38:29.234911 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-672mz_c9f79c44-e93f-48ba-9f2d-8a5b61a86089/registry-server/0.log" Mar 12 18:39:26 crc kubenswrapper[4926]: I0312 18:39:26.817588 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:39:26 crc kubenswrapper[4926]: I0312 18:39:26.818096 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:39:56 crc kubenswrapper[4926]: I0312 18:39:56.818129 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:39:56 crc kubenswrapper[4926]: I0312 18:39:56.820352 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.153630 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555680-cfpm6"] Mar 12 18:40:00 crc kubenswrapper[4926]: E0312 18:40:00.154649 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.154665 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" Mar 12 18:40:00 crc kubenswrapper[4926]: E0312 18:40:00.154882 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e31677c-4037-4cbc-9d34-5487431793e9" containerName="oc" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.154890 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e31677c-4037-4cbc-9d34-5487431793e9" containerName="oc" Mar 12 18:40:00 crc kubenswrapper[4926]: E0312 18:40:00.154915 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="extract-content" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.154923 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="extract-content" Mar 12 18:40:00 crc kubenswrapper[4926]: E0312 18:40:00.154941 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="extract-utilities" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.154949 4926 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="extract-utilities" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.155136 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e31677c-4037-4cbc-9d34-5487431793e9" containerName="oc" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.155159 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="305b4cae-bc27-4c5d-8142-042446a753af" containerName="registry-server" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.156007 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.161073 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.161363 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.161594 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.166092 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555680-cfpm6"] Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.218190 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m722\" (UniqueName: \"kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722\") pod \"auto-csr-approver-29555680-cfpm6\" (UID: \"d3cd40c1-ffde-4c68-b932-d84c58ec4364\") " pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.320939 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m722\" (UniqueName: \"kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722\") pod \"auto-csr-approver-29555680-cfpm6\" (UID: \"d3cd40c1-ffde-4c68-b932-d84c58ec4364\") " pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.348758 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m722\" (UniqueName: \"kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722\") pod \"auto-csr-approver-29555680-cfpm6\" (UID: \"d3cd40c1-ffde-4c68-b932-d84c58ec4364\") " pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.475470 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:00 crc kubenswrapper[4926]: I0312 18:40:00.985186 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555680-cfpm6"] Mar 12 18:40:01 crc kubenswrapper[4926]: I0312 18:40:01.790258 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" event={"ID":"d3cd40c1-ffde-4c68-b932-d84c58ec4364","Type":"ContainerStarted","Data":"300c905d81cac499e9f45131e33911c3e5941a151d34623a2565c4b79bcdc657"} Mar 12 18:40:03 crc kubenswrapper[4926]: I0312 18:40:03.811921 4926 generic.go:334] "Generic (PLEG): container finished" podID="d3cd40c1-ffde-4c68-b932-d84c58ec4364" containerID="ed5e9b74d75807b6ccd7ef1a2735ad64a65a226a6e6c26e9d80b7971ca9a6cd3" exitCode=0 Mar 12 18:40:03 crc kubenswrapper[4926]: I0312 18:40:03.812199 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" event={"ID":"d3cd40c1-ffde-4c68-b932-d84c58ec4364","Type":"ContainerDied","Data":"ed5e9b74d75807b6ccd7ef1a2735ad64a65a226a6e6c26e9d80b7971ca9a6cd3"} Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.243048 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.337593 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m722\" (UniqueName: \"kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722\") pod \"d3cd40c1-ffde-4c68-b932-d84c58ec4364\" (UID: \"d3cd40c1-ffde-4c68-b932-d84c58ec4364\") " Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.343728 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722" (OuterVolumeSpecName: "kube-api-access-2m722") pod "d3cd40c1-ffde-4c68-b932-d84c58ec4364" (UID: "d3cd40c1-ffde-4c68-b932-d84c58ec4364"). InnerVolumeSpecName "kube-api-access-2m722". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.441293 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m722\" (UniqueName: \"kubernetes.io/projected/d3cd40c1-ffde-4c68-b932-d84c58ec4364-kube-api-access-2m722\") on node \"crc\" DevicePath \"\"" Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.835497 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" event={"ID":"d3cd40c1-ffde-4c68-b932-d84c58ec4364","Type":"ContainerDied","Data":"300c905d81cac499e9f45131e33911c3e5941a151d34623a2565c4b79bcdc657"} Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.835569 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300c905d81cac499e9f45131e33911c3e5941a151d34623a2565c4b79bcdc657" Mar 12 18:40:05 crc kubenswrapper[4926]: I0312 18:40:05.835627 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555680-cfpm6" Mar 12 18:40:06 crc kubenswrapper[4926]: I0312 18:40:06.334944 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555674-2b7bv"] Mar 12 18:40:06 crc kubenswrapper[4926]: I0312 18:40:06.342651 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555674-2b7bv"] Mar 12 18:40:06 crc kubenswrapper[4926]: I0312 18:40:06.507906 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="540a72de-64b2-41af-9caf-95895f11cb79" path="/var/lib/kubelet/pods/540a72de-64b2-41af-9caf-95895f11cb79/volumes" Mar 12 18:40:11 crc kubenswrapper[4926]: I0312 18:40:11.906584 4926 generic.go:334] "Generic (PLEG): container finished" podID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerID="4dbea05e4f0fb083c128a04cb1592adbe2b7259461b2a690428955b1ed545c5d" exitCode=0 Mar 12 18:40:11 crc kubenswrapper[4926]: I0312 18:40:11.906728 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" event={"ID":"2562de93-d8c3-4055-8ab2-3c55b4f3c830","Type":"ContainerDied","Data":"4dbea05e4f0fb083c128a04cb1592adbe2b7259461b2a690428955b1ed545c5d"} Mar 12 18:40:11 crc kubenswrapper[4926]: I0312 18:40:11.908620 4926 scope.go:117] "RemoveContainer" containerID="4dbea05e4f0fb083c128a04cb1592adbe2b7259461b2a690428955b1ed545c5d" Mar 12 18:40:12 crc kubenswrapper[4926]: I0312 18:40:12.068678 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nz8z6_must-gather-x4gfw_2562de93-d8c3-4055-8ab2-3c55b4f3c830/gather/0.log" Mar 12 18:40:19 crc kubenswrapper[4926]: I0312 18:40:19.495482 4926 scope.go:117] "RemoveContainer" containerID="804c89c001a28f1eae59ba1c1cb327e0cc2c54749f5a07cd9cd07164cc1b86c4" Mar 12 18:40:19 crc kubenswrapper[4926]: I0312 18:40:19.677489 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-nz8z6/must-gather-x4gfw"] Mar 12 18:40:19 crc kubenswrapper[4926]: I0312 18:40:19.678075 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="copy" containerID="cri-o://9983eed33b5073c82867ffe643c25901fb02264d34aa9779308efc3e4497884a" gracePeriod=2 Mar 12 18:40:19 crc kubenswrapper[4926]: I0312 18:40:19.687102 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-nz8z6/must-gather-x4gfw"] Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.003088 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nz8z6_must-gather-x4gfw_2562de93-d8c3-4055-8ab2-3c55b4f3c830/copy/0.log" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.003552 4926 generic.go:334] "Generic (PLEG): container finished" podID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerID="9983eed33b5073c82867ffe643c25901fb02264d34aa9779308efc3e4497884a" exitCode=143 Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.210251 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nz8z6_must-gather-x4gfw_2562de93-d8c3-4055-8ab2-3c55b4f3c830/copy/0.log" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.211005 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.299117 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w44tw\" (UniqueName: \"kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw\") pod \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.299232 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output\") pod \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\" (UID: \"2562de93-d8c3-4055-8ab2-3c55b4f3c830\") " Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.310735 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw" (OuterVolumeSpecName: "kube-api-access-w44tw") pod "2562de93-d8c3-4055-8ab2-3c55b4f3c830" (UID: "2562de93-d8c3-4055-8ab2-3c55b4f3c830"). InnerVolumeSpecName "kube-api-access-w44tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.401619 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w44tw\" (UniqueName: \"kubernetes.io/projected/2562de93-d8c3-4055-8ab2-3c55b4f3c830-kube-api-access-w44tw\") on node \"crc\" DevicePath \"\"" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.458612 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2562de93-d8c3-4055-8ab2-3c55b4f3c830" (UID: "2562de93-d8c3-4055-8ab2-3c55b4f3c830"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.500733 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" path="/var/lib/kubelet/pods/2562de93-d8c3-4055-8ab2-3c55b4f3c830/volumes" Mar 12 18:40:20 crc kubenswrapper[4926]: I0312 18:40:20.502573 4926 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2562de93-d8c3-4055-8ab2-3c55b4f3c830-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 12 18:40:21 crc kubenswrapper[4926]: I0312 18:40:21.011905 4926 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-nz8z6_must-gather-x4gfw_2562de93-d8c3-4055-8ab2-3c55b4f3c830/copy/0.log" Mar 12 18:40:21 crc kubenswrapper[4926]: I0312 18:40:21.012215 4926 scope.go:117] "RemoveContainer" containerID="9983eed33b5073c82867ffe643c25901fb02264d34aa9779308efc3e4497884a" Mar 12 18:40:21 crc kubenswrapper[4926]: I0312 18:40:21.012278 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nz8z6/must-gather-x4gfw" Mar 12 18:40:21 crc kubenswrapper[4926]: I0312 18:40:21.031483 4926 scope.go:117] "RemoveContainer" containerID="4dbea05e4f0fb083c128a04cb1592adbe2b7259461b2a690428955b1ed545c5d" Mar 12 18:40:26 crc kubenswrapper[4926]: I0312 18:40:26.817593 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:40:26 crc kubenswrapper[4926]: I0312 18:40:26.818183 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:40:26 crc kubenswrapper[4926]: I0312 18:40:26.818248 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" Mar 12 18:40:26 crc kubenswrapper[4926]: I0312 18:40:26.819258 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 18:40:26 crc kubenswrapper[4926]: I0312 18:40:26.819323 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c" gracePeriod=600 Mar 12 18:40:27 crc kubenswrapper[4926]: I0312 18:40:27.090710 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c" exitCode=0 Mar 12 18:40:27 crc kubenswrapper[4926]: I0312 18:40:27.091098 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c"} Mar 12 18:40:27 crc kubenswrapper[4926]: I0312 18:40:27.091146 4926 scope.go:117] "RemoveContainer" containerID="842a75b054aae388d59ac83e483a69d941997a23cf47d9012a53fea65a005b5e" Mar 12 18:40:28 crc kubenswrapper[4926]: I0312 18:40:28.099924 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerStarted","Data":"79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2"} Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.976616 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:40:58 crc kubenswrapper[4926]: E0312 18:40:58.984043 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="copy" Mar 12 18:40:58 crc kubenswrapper[4926]: 
I0312 18:40:58.984072 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="copy" Mar 12 18:40:58 crc kubenswrapper[4926]: E0312 18:40:58.984092 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cd40c1-ffde-4c68-b932-d84c58ec4364" containerName="oc" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.984102 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cd40c1-ffde-4c68-b932-d84c58ec4364" containerName="oc" Mar 12 18:40:58 crc kubenswrapper[4926]: E0312 18:40:58.984118 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="gather" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.984125 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="gather" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.984333 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="copy" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.984348 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2562de93-d8c3-4055-8ab2-3c55b4f3c830" containerName="gather" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.984358 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cd40c1-ffde-4c68-b932-d84c58ec4364" containerName="oc" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.986072 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:58 crc kubenswrapper[4926]: I0312 18:40:58.998502 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.036626 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.037082 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtsw\" (UniqueName: \"kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.037270 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.139610 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.139767 4926 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtsw\" (UniqueName: \"kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.139840 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.140181 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.140255 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.165532 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtsw\" (UniqueName: \"kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw\") pod \"certified-operators-4fx77\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.311651 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:40:59 crc kubenswrapper[4926]: I0312 18:40:59.786780 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:41:00 crc kubenswrapper[4926]: I0312 18:41:00.424319 4926 generic.go:334] "Generic (PLEG): container finished" podID="d7e47942-3557-4107-a493-c18e4d174d1e" containerID="5571f3a9f466caaa9333a8b56f85c61cee8d3b7a08a46b6c991362072f21c04e" exitCode=0 Mar 12 18:41:00 crc kubenswrapper[4926]: I0312 18:41:00.424383 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerDied","Data":"5571f3a9f466caaa9333a8b56f85c61cee8d3b7a08a46b6c991362072f21c04e"} Mar 12 18:41:00 crc kubenswrapper[4926]: I0312 18:41:00.424719 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerStarted","Data":"81fcc693eefa118dd047e3cba7618229952a64d7026c69e5f1def03ed6e8bd7d"} Mar 12 18:41:01 crc kubenswrapper[4926]: I0312 18:41:01.435728 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerStarted","Data":"a16e7dd57edf66ac5ce9d3d0f07997272a6bc5cd63023c443aa9cd54a9bf9d1d"} Mar 12 18:41:02 crc kubenswrapper[4926]: I0312 18:41:02.447113 4926 generic.go:334] "Generic (PLEG): container finished" podID="d7e47942-3557-4107-a493-c18e4d174d1e" containerID="a16e7dd57edf66ac5ce9d3d0f07997272a6bc5cd63023c443aa9cd54a9bf9d1d" exitCode=0 Mar 12 18:41:02 crc kubenswrapper[4926]: I0312 18:41:02.447192 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerDied","Data":"a16e7dd57edf66ac5ce9d3d0f07997272a6bc5cd63023c443aa9cd54a9bf9d1d"} Mar 12 18:41:04 crc kubenswrapper[4926]: I0312 18:41:04.474874 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerStarted","Data":"f9735ccc51d66ef1ac0075c04afb2fe41d5c57065507fc42e109f6d09e13bd75"} Mar 12 18:41:04 crc kubenswrapper[4926]: I0312 18:41:04.508722 4926 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4fx77" podStartSLOduration=3.469928134 podStartE2EDuration="6.508704734s" podCreationTimestamp="2026-03-12 18:40:58 +0000 UTC" firstStartedPulling="2026-03-12 18:41:00.426934449 +0000 UTC m=+2300.795560812" lastFinishedPulling="2026-03-12 18:41:03.465711069 +0000 UTC m=+2303.834337412" observedRunningTime="2026-03-12 18:41:04.502669952 +0000 UTC m=+2304.871296295" watchObservedRunningTime="2026-03-12 18:41:04.508704734 +0000 UTC m=+2304.877331077" Mar 12 18:41:09 crc kubenswrapper[4926]: I0312 18:41:09.311800 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:09 crc kubenswrapper[4926]: I0312 18:41:09.312325 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:09 crc kubenswrapper[4926]: I0312 18:41:09.363061 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:09 crc kubenswrapper[4926]: I0312 18:41:09.579792 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:09 crc kubenswrapper[4926]: I0312 18:41:09.624253 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:41:11 crc kubenswrapper[4926]: I0312 18:41:11.553332 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4fx77" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="registry-server" containerID="cri-o://f9735ccc51d66ef1ac0075c04afb2fe41d5c57065507fc42e109f6d09e13bd75" gracePeriod=2 Mar 12 18:41:12 crc kubenswrapper[4926]: I0312 18:41:12.567469 4926 generic.go:334] "Generic (PLEG): container finished" podID="d7e47942-3557-4107-a493-c18e4d174d1e" containerID="f9735ccc51d66ef1ac0075c04afb2fe41d5c57065507fc42e109f6d09e13bd75" exitCode=0 Mar 12 18:41:12 crc kubenswrapper[4926]: I0312 18:41:12.567524 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerDied","Data":"f9735ccc51d66ef1ac0075c04afb2fe41d5c57065507fc42e109f6d09e13bd75"} Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.393508 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.530567 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbtsw\" (UniqueName: \"kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw\") pod \"d7e47942-3557-4107-a493-c18e4d174d1e\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.530665 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities\") pod \"d7e47942-3557-4107-a493-c18e4d174d1e\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.530734 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content\") pod \"d7e47942-3557-4107-a493-c18e4d174d1e\" (UID: \"d7e47942-3557-4107-a493-c18e4d174d1e\") " Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.532855 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities" (OuterVolumeSpecName: "utilities") pod "d7e47942-3557-4107-a493-c18e4d174d1e" (UID: "d7e47942-3557-4107-a493-c18e4d174d1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.540192 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw" (OuterVolumeSpecName: "kube-api-access-bbtsw") pod "d7e47942-3557-4107-a493-c18e4d174d1e" (UID: "d7e47942-3557-4107-a493-c18e4d174d1e"). InnerVolumeSpecName "kube-api-access-bbtsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.583669 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4fx77" event={"ID":"d7e47942-3557-4107-a493-c18e4d174d1e","Type":"ContainerDied","Data":"81fcc693eefa118dd047e3cba7618229952a64d7026c69e5f1def03ed6e8bd7d"} Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.583731 4926 scope.go:117] "RemoveContainer" containerID="f9735ccc51d66ef1ac0075c04afb2fe41d5c57065507fc42e109f6d09e13bd75" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.583731 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4fx77" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.595175 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7e47942-3557-4107-a493-c18e4d174d1e" (UID: "d7e47942-3557-4107-a493-c18e4d174d1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.632785 4926 scope.go:117] "RemoveContainer" containerID="a16e7dd57edf66ac5ce9d3d0f07997272a6bc5cd63023c443aa9cd54a9bf9d1d" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.633370 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.633405 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e47942-3557-4107-a493-c18e4d174d1e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.633420 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbtsw\" (UniqueName: \"kubernetes.io/projected/d7e47942-3557-4107-a493-c18e4d174d1e-kube-api-access-bbtsw\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.661380 4926 scope.go:117] "RemoveContainer" containerID="5571f3a9f466caaa9333a8b56f85c61cee8d3b7a08a46b6c991362072f21c04e" Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.929022 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:41:13 crc kubenswrapper[4926]: I0312 18:41:13.939372 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4fx77"] Mar 12 18:41:14 crc kubenswrapper[4926]: I0312 18:41:14.500595 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" path="/var/lib/kubelet/pods/d7e47942-3557-4107-a493-c18e4d174d1e/volumes" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.142411 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:20 crc kubenswrapper[4926]: E0312 18:41:20.143352 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="extract-utilities" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.143369 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="extract-utilities" Mar 12 18:41:20 crc 
kubenswrapper[4926]: E0312 18:41:20.143390 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="registry-server" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.143399 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="registry-server" Mar 12 18:41:20 crc kubenswrapper[4926]: E0312 18:41:20.143418 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="extract-content" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.143429 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="extract-content" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.143706 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e47942-3557-4107-a493-c18e4d174d1e" containerName="registry-server" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.145299 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.161800 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.265707 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.266309 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgm8j\" (UniqueName: \"kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.266351 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.368575 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgm8j\" (UniqueName: \"kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.368633 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.368768 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.369193 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.369401 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.395960 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgm8j\" (UniqueName: \"kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j\") pod \"redhat-marketplace-822np\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.479303 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:20 crc kubenswrapper[4926]: I0312 18:41:20.923523 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:21 crc kubenswrapper[4926]: I0312 18:41:21.666189 4926 generic.go:334] "Generic (PLEG): container finished" podID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerID="254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab" exitCode=0 Mar 12 18:41:21 crc kubenswrapper[4926]: I0312 18:41:21.666513 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerDied","Data":"254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab"} Mar 12 18:41:21 crc kubenswrapper[4926]: I0312 18:41:21.666609 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerStarted","Data":"be102d0aa90eeef589f542e7c63da5e76a87f4a8ad4b9fbaa525d5b4bde3a687"} Mar 12 18:41:22 crc kubenswrapper[4926]: I0312 18:41:22.688468 4926 generic.go:334] "Generic (PLEG): container finished" podID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerID="987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0" exitCode=0 Mar 12 18:41:22 crc kubenswrapper[4926]: I0312 18:41:22.688581 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerDied","Data":"987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0"} Mar 12 18:41:23 crc kubenswrapper[4926]: I0312 18:41:23.702299 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerStarted","Data":"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea"} Mar 12 
Mar 12 18:41:30 crc kubenswrapper[4926]: I0312 18:41:30.480314 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:30 crc kubenswrapper[4926]: I0312 18:41:30.480920 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:30 crc kubenswrapper[4926]: I0312 18:41:30.537324 4926 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:30 crc kubenswrapper[4926]: I0312 18:41:30.823812 4926 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:30 crc kubenswrapper[4926]: I0312 18:41:30.880330 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:32 crc kubenswrapper[4926]: I0312 18:41:32.782028 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-822np" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="registry-server" containerID="cri-o://9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea" gracePeriod=2 Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.306949 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.442754 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgm8j\" (UniqueName: \"kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j\") pod \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.442876 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content\") pod \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.442907 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities\") pod \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\" (UID: \"bd48d2ff-4171-4a1a-a3f7-d117afc68847\") " Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.444655 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities" (OuterVolumeSpecName: "utilities") pod "bd48d2ff-4171-4a1a-a3f7-d117afc68847" (UID: "bd48d2ff-4171-4a1a-a3f7-d117afc68847").
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.468362 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j" (OuterVolumeSpecName: "kube-api-access-zgm8j") pod "bd48d2ff-4171-4a1a-a3f7-d117afc68847" (UID: "bd48d2ff-4171-4a1a-a3f7-d117afc68847"). InnerVolumeSpecName "kube-api-access-zgm8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.485240 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd48d2ff-4171-4a1a-a3f7-d117afc68847" (UID: "bd48d2ff-4171-4a1a-a3f7-d117afc68847"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.545499 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgm8j\" (UniqueName: \"kubernetes.io/projected/bd48d2ff-4171-4a1a-a3f7-d117afc68847-kube-api-access-zgm8j\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.545529 4926 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.545539 4926 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd48d2ff-4171-4a1a-a3f7-d117afc68847-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.791632 4926 generic.go:334] "Generic (PLEG): container finished" podID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerID="9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea" exitCode=0 Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.791673 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerDied","Data":"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea"} Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.791704 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-822np" event={"ID":"bd48d2ff-4171-4a1a-a3f7-d117afc68847","Type":"ContainerDied","Data":"be102d0aa90eeef589f542e7c63da5e76a87f4a8ad4b9fbaa525d5b4bde3a687"} Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.791720 4926 scope.go:117] "RemoveContainer" containerID="9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.791841 4926 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-822np" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.816053 4926 scope.go:117] "RemoveContainer" containerID="987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.830682 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.837999 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-822np"] Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.849288 4926 scope.go:117] "RemoveContainer" containerID="254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.905020 4926 scope.go:117] "RemoveContainer" containerID="9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea" Mar 12 18:41:33 crc kubenswrapper[4926]: E0312 18:41:33.905576 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea\": container with ID starting with 9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea not found: ID does not exist" containerID="9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.905630 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea"} err="failed to get container status \"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea\": rpc error: code = NotFound desc = could not find container \"9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea\": container with ID starting with 9f46a3f2958fc6e08484fb487adaaf277fd41172828d236bc496aa7dc29cf0ea not found: ID does not exist" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.905665 4926 scope.go:117] "RemoveContainer" containerID="987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0" Mar 12 18:41:33 crc kubenswrapper[4926]: E0312 18:41:33.905969 4926 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0\": container with ID starting with 987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0 not found: ID does not exist" containerID="987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.905990 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0"} err="failed to get container status \"987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0\": rpc error: code = NotFound desc = could not find container \"987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0\": container with ID starting with 987d4bae5548023fd3225ac0b17dd33eced3dc2e32196319e0e0ab5cbc06f4b0 not found: ID does not exist" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.906004 4926 scope.go:117] "RemoveContainer" containerID="254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab" Mar 12 18:41:33 crc kubenswrapper[4926]: E0312 18:41:33.906267 4926 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab\": container with ID starting with 254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab not found: ID does not exist" containerID="254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab" Mar 12 18:41:33 crc kubenswrapper[4926]: I0312 18:41:33.906292 4926 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab"} err="failed to get container status \"254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab\": rpc error: code = NotFound desc = could not find container \"254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab\": container with ID starting with 254ef0ca2b3871aa0700b6c4457562c54faac494d6dcd4ad0aa5d755780cd7ab not found: ID does not exist" Mar 12 18:41:34 crc kubenswrapper[4926]: I0312 18:41:34.504524 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" path="/var/lib/kubelet/pods/bd48d2ff-4171-4a1a-a3f7-d117afc68847/volumes" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.154201 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555682-rsgxm"] Mar 12 18:42:00 crc kubenswrapper[4926]: E0312 18:42:00.155150 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="extract-content" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.155165 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="extract-content" Mar 12 18:42:00 crc kubenswrapper[4926]: E0312 18:42:00.155193 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="registry-server" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.155201 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="registry-server" Mar 12 18:42:00 crc kubenswrapper[4926]: E0312 18:42:00.155223 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="extract-utilities" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.155233 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="extract-utilities" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.155460 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd48d2ff-4171-4a1a-a3f7-d117afc68847" containerName="registry-server" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.164867 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.168524 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.169063 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.169215 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.177185 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555682-rsgxm"] Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.192836 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x268q\" (UniqueName: \"kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q\") pod \"auto-csr-approver-29555682-rsgxm\" (UID: \"2955f1b9-7c06-4681-b024-65ce46309648\") " pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.294546 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x268q\" (UniqueName: \"kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q\") pod \"auto-csr-approver-29555682-rsgxm\" (UID: \"2955f1b9-7c06-4681-b024-65ce46309648\") " pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.314386 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x268q\" (UniqueName: \"kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q\") pod \"auto-csr-approver-29555682-rsgxm\" (UID: \"2955f1b9-7c06-4681-b024-65ce46309648\") " pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.491096 4926 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:00 crc kubenswrapper[4926]: I0312 18:42:00.942264 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555682-rsgxm"] Mar 12 18:42:00 crc kubenswrapper[4926]: W0312 18:42:00.948398 4926 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2955f1b9_7c06_4681_b024_65ce46309648.slice/crio-b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9 WatchSource:0}: Error finding container b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9: Status 404 returned error can't find the container with id b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9 Mar 12 18:42:01 crc kubenswrapper[4926]: I0312 18:42:01.100763 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" event={"ID":"2955f1b9-7c06-4681-b024-65ce46309648","Type":"ContainerStarted","Data":"b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9"} Mar 12 18:42:03 crc kubenswrapper[4926]: I0312 18:42:03.119854 4926 generic.go:334] "Generic (PLEG): container finished" podID="2955f1b9-7c06-4681-b024-65ce46309648" containerID="ae4090da1d7102f7135d176cea5cebb506f68ec5d1e9004e09e2c6fdc15528d7" exitCode=0 Mar 12 18:42:03 crc kubenswrapper[4926]: I0312 18:42:03.120039 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" event={"ID":"2955f1b9-7c06-4681-b024-65ce46309648","Type":"ContainerDied","Data":"ae4090da1d7102f7135d176cea5cebb506f68ec5d1e9004e09e2c6fdc15528d7"} Mar 12 18:42:04 crc kubenswrapper[4926]: I0312 18:42:04.520212 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:04 crc kubenswrapper[4926]: I0312 18:42:04.696170 4926 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x268q\" (UniqueName: \"kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q\") pod \"2955f1b9-7c06-4681-b024-65ce46309648\" (UID: \"2955f1b9-7c06-4681-b024-65ce46309648\") " Mar 12 18:42:04 crc kubenswrapper[4926]: I0312 18:42:04.701832 4926 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q" (OuterVolumeSpecName: "kube-api-access-x268q") pod "2955f1b9-7c06-4681-b024-65ce46309648" (UID: "2955f1b9-7c06-4681-b024-65ce46309648"). InnerVolumeSpecName "kube-api-access-x268q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 18:42:04 crc kubenswrapper[4926]: I0312 18:42:04.798752 4926 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x268q\" (UniqueName: \"kubernetes.io/projected/2955f1b9-7c06-4681-b024-65ce46309648-kube-api-access-x268q\") on node \"crc\" DevicePath \"\"" Mar 12 18:42:05 crc kubenswrapper[4926]: I0312 18:42:05.145234 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" event={"ID":"2955f1b9-7c06-4681-b024-65ce46309648","Type":"ContainerDied","Data":"b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9"} Mar 12 18:42:05 crc kubenswrapper[4926]: I0312 18:42:05.145285 4926 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2956325606c98cf949c53ea15ba2eef378c1602cb590b8b636988f46dd4d2e9" Mar 12 18:42:05 crc kubenswrapper[4926]: I0312 18:42:05.145347 4926 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555682-rsgxm" Mar 12 18:42:05 crc kubenswrapper[4926]: I0312 18:42:05.603738 4926 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555676-5d5p5"] Mar 12 18:42:05 crc kubenswrapper[4926]: I0312 18:42:05.612035 4926 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555676-5d5p5"] Mar 12 18:42:06 crc kubenswrapper[4926]: I0312 18:42:06.509624 4926 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1b045ea-1fff-43b3-9a66-48dd361e9f33" path="/var/lib/kubelet/pods/f1b045ea-1fff-43b3-9a66-48dd361e9f33/volumes" Mar 12 18:42:19 crc kubenswrapper[4926]: I0312 18:42:19.688077 4926 scope.go:117] "RemoveContainer" containerID="cfe5fa51265434fa1414bed949d3e915d15d11d193d0cf363993fcfab40b91d4" Mar 12 18:42:56 crc kubenswrapper[4926]: I0312 18:42:56.818186 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:42:56 crc kubenswrapper[4926]: I0312 18:42:56.818780 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:43:26 crc kubenswrapper[4926]: I0312 18:43:26.818079 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 18:43:26 crc kubenswrapper[4926]: I0312 18:43:26.818746 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 18:43:56 crc kubenswrapper[4926]: I0312 18:43:56.818015 4926 patch_prober.go:28] interesting pod/machine-config-daemon-hmdg8 container/machine-config-daemon 
Mar 12 18:43:56 crc kubenswrapper[4926]: I0312 18:43:56.818538 4926 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 18:43:56 crc kubenswrapper[4926]: I0312 18:43:56.818578 4926 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8"
Mar 12 18:43:56 crc kubenswrapper[4926]: I0312 18:43:56.819045 4926 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2"} pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 18:43:56 crc kubenswrapper[4926]: I0312 18:43:56.819090 4926 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerName="machine-config-daemon" containerID="cri-o://79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2" gracePeriod=600
Mar 12 18:43:56 crc kubenswrapper[4926]: E0312 18:43:56.957582 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464"
Mar 12 18:43:57 crc kubenswrapper[4926]: I0312 18:43:57.360858 4926 generic.go:334] "Generic (PLEG): container finished" podID="f7b34559-da2f-4796-8f3f-c56b2725c464" containerID="79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2" exitCode=0
Mar 12 18:43:57 crc kubenswrapper[4926]: I0312 18:43:57.360903 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" event={"ID":"f7b34559-da2f-4796-8f3f-c56b2725c464","Type":"ContainerDied","Data":"79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2"}
Mar 12 18:43:57 crc kubenswrapper[4926]: I0312 18:43:57.360960 4926 scope.go:117] "RemoveContainer" containerID="2201cfc89c392b2d9df343a46350dae7f7620675e753c895f182c00c5eb9467c"
Mar 12 18:43:57 crc kubenswrapper[4926]: I0312 18:43:57.361787 4926 scope.go:117] "RemoveContainer" containerID="79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2"
Mar 12 18:43:57 crc kubenswrapper[4926]: E0312 18:43:57.362270 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464"
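Three consecutive probe failures (18:42:56, 18:43:26, 18:43:56, consistent with a 30s period and, presumably, the default failureThreshold of 3) make the kubelet kill machine-config-daemon, and the restart immediately lands in CrashLoopBackOff because earlier restarts already drove the back-off to its 5m cap. The quickest check is the probed endpoint itself rather than the probe machinery; a sketch, assuming shell access to the node (e.g. `oc debug node/crc`, then `chroot /host`):

    # Probe target from the log above; "connection refused" here means the
    # daemon simply is not listening, matching what the kubelet saw.
    curl -sS http://127.0.0.1:8798/health
    # Restart count, back-off state, and last termination reason:
    oc -n openshift-machine-config-operator describe pod machine-config-daemon-hmdg8
    # Logs of the container instance that was failing the probe:
    oc -n openshift-machine-config-operator logs machine-config-daemon-hmdg8 \
      -c machine-config-daemon --previous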
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.150468 4926 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555684-njhqk"]
Mar 12 18:44:00 crc kubenswrapper[4926]: E0312 18:44:00.151347 4926 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2955f1b9-7c06-4681-b024-65ce46309648" containerName="oc"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.151364 4926 state_mem.go:107] "Deleted CPUSet assignment" podUID="2955f1b9-7c06-4681-b024-65ce46309648" containerName="oc"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.151646 4926 memory_manager.go:354] "RemoveStaleState removing state" podUID="2955f1b9-7c06-4681-b024-65ce46309648" containerName="oc"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.152350 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555684-njhqk"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.155118 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.156099 4926 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.156568 4926 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-24cm5"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.159628 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555684-njhqk"]
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.206676 4926 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlw9x\" (UniqueName: \"kubernetes.io/projected/6291e99f-3a9d-4fa6-a390-7000bc4bb107-kube-api-access-hlw9x\") pod \"auto-csr-approver-29555684-njhqk\" (UID: \"6291e99f-3a9d-4fa6-a390-7000bc4bb107\") " pod="openshift-infra/auto-csr-approver-29555684-njhqk"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.308126 4926 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlw9x\" (UniqueName: \"kubernetes.io/projected/6291e99f-3a9d-4fa6-a390-7000bc4bb107-kube-api-access-hlw9x\") pod \"auto-csr-approver-29555684-njhqk\" (UID: \"6291e99f-3a9d-4fa6-a390-7000bc4bb107\") " pod="openshift-infra/auto-csr-approver-29555684-njhqk"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.329682 4926 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlw9x\" (UniqueName: \"kubernetes.io/projected/6291e99f-3a9d-4fa6-a390-7000bc4bb107-kube-api-access-hlw9x\") pod \"auto-csr-approver-29555684-njhqk\" (UID: \"6291e99f-3a9d-4fa6-a390-7000bc4bb107\") " pod="openshift-infra/auto-csr-approver-29555684-njhqk"
Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.475234 4926 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555684-njhqk"
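The kube-api-access-hlw9x volume mounted above is the pod's projected service-account token volume (token, kube-root-ca.crt, and namespace), which the reconciler verifies, attaches, and mounts before the sandbox is created. While the job pod exists, its definition can be read back from the live pod spec; a sketch using the names from the log:

    # Show the projected volume's sources for this pod (volume name taken
    # from the MountVolume entries above).
    oc -n openshift-infra get pod auto-csr-approver-29555684-njhqk \
      -o jsonpath='{.spec.volumes[?(@.name=="kube-api-access-hlw9x")]}'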
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555684-njhqk" Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.960026 4926 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555684-njhqk"] Mar 12 18:44:00 crc kubenswrapper[4926]: I0312 18:44:00.966378 4926 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 18:44:01 crc kubenswrapper[4926]: I0312 18:44:01.405611 4926 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555684-njhqk" event={"ID":"6291e99f-3a9d-4fa6-a390-7000bc4bb107","Type":"ContainerStarted","Data":"64143db8c03f28b46b904bc03153011d44e646babf188aa5c3447de2a0718546"} Mar 12 18:44:01 crc kubenswrapper[4926]: E0312 18:44:01.422480 4926 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/openshift4/ose-cli:latest: Manifest does not match provided manifest digest sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 12 18:44:01 crc kubenswrapper[4926]: E0312 18:44:01.422721 4926 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 18:44:01 crc kubenswrapper[4926]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 12 18:44:01 crc kubenswrapper[4926]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hlw9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29555684-njhqk_openshift-infra(6291e99f-3a9d-4fa6-a390-7000bc4bb107): ErrImagePull: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/openshift4/ose-cli:latest: Manifest does not match provided manifest digest sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9 Mar 12 18:44:01 crc kubenswrapper[4926]: > logger="UnhandledError" Mar 12 18:44:01 crc kubenswrapper[4926]: E0312 18:44:01.424352 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/openshift4/ose-cli:latest: Manifest does not match provided manifest digest sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\"" pod="openshift-infra/auto-csr-approver-29555684-njhqk" podUID="6291e99f-3a9d-4fa6-a390-7000bc4bb107" Mar 12 18:44:02 crc kubenswrapper[4926]: E0312 18:44:02.415462 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29555684-njhqk" podUID="6291e99f-3a9d-4fa6-a390-7000bc4bb107" Mar 12 18:44:09 crc kubenswrapper[4926]: I0312 18:44:09.490338 4926 scope.go:117] "RemoveContainer" containerID="79944b20ee2855a4a0a1dba29df2c15482bb6976a2ebdd8437759ebdec65cbd2" Mar 12 18:44:09 crc kubenswrapper[4926]: E0312 18:44:09.491477 4926 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hmdg8_openshift-machine-config-operator(f7b34559-da2f-4796-8f3f-c56b2725c464)\"" pod="openshift-machine-config-operator/machine-config-daemon-hmdg8" podUID="f7b34559-da2f-4796-8f3f-c56b2725c464"